Ubuntu – Long Term Support

One item of particular interest with Ubuntu is its development schedule. Because a typical Linux distribution is composed of many applications from many different parties, the Ubuntu developers do not directly control or develop much of the software included in Ubuntu. Furthermore, Ubuntu tries to be a complete desktop environment rather than just an operating system, which means it includes a wider variety of software than what’s found in Windows and Mac OS X.

What this amounts to is that Ubuntu needs to both provide future patch support for the applications it includes and compensate for the fact that its developers don’t write many of these programs themselves. Coupled with this is the fact that 3rd party application development is not necessarily synchronized to Ubuntu’s release schedule, and some applications (and the kernel itself) can have a rather rapid development rate.

Trying to deal with all of these factors, Ubuntu has settled on two classes of releases. Every 6 months – in October and April – Ubuntu takes what’s ready and releases a new version of the OS. For 1st party material this is tied to some goal for the release (such as replacing the audio daemon), while for 3rd party software this may be as simple as grabbing the latest version. This puts regular Ubuntu versions in an unusual position when trying to classify them – it’s significantly more than a Mac OS X point update, still more than a Windows service pack, and yet a single release generally encompasses less than a new version of either OS. But at the same time, there’s no guarantee that any given release of Ubuntu won’t break software compatibility or binary driver compatibility, which puts it up there with major OS releases.

Furthermore, because of the need to provide security updates for all of these different programs across all of these different versions, Ubuntu has a very short support cycle. During that cycle only bug fixes and security updates are issued; software is not otherwise changed, as each release is intended to represent a stable platform. A regular release is only supported for 1.5 years, which means, for example, that support for 7.10 Gutsy, the immediate predecessor to 8.04 Hardy Heron, expired in April. This pushes new versions of Ubuntu back towards the idea of them being closer to a service pack or a point release. In essence, it’s intended that everyone using regular versions of Ubuntu will stay on a relatively rapid upgrade treadmill.

But this obviously doesn’t work for everyone, which results in there being two classes of Ubuntu. What we’re looking at today, 8.04, is what Ubuntu calls a long term support (LTS) release. Every 2 years a version of Ubuntu is labeled as an LTS release, which entails a much greater effort on the developers’ part to support that edition of the OS. The standard support period is 3 years instead of 1.5 years, and for the server edition of the OS that becomes 5 years.

This makes the LTS releases more comparable to Mac OS X and Windows, both of which have long support periods in excess of 3 years. This is also why we’re starting with a review of Hardy, in spite of it being over a year old now: it’s the current LTS release. Regular short-support Ubuntu releases have their place, but they are not intended for long-term use. Coming from Windows or Mac OS X, an LTS release is the closest equivalent.

Operating System    Mainstream Support               Extended Support
Windows             5 years                          5 additional years
Ubuntu              1.5 years                        None
Ubuntu LTS          3 years                          None
Mac OS X            So long as it's the newest OS    So long as it's one version behind
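
To put these windows in concrete terms, here is a small worked example (ours, not anything published by Ubuntu) that computes the nominal end-of-support dates implied by the policy above. The release dates are Ubuntu's published ones; the support lengths are simply the 18-month and 3/5-year figures quoted in this article, so the results are nominal rather than official end-of-life announcements.

```python
# Worked example: nominal end-of-support dates implied by the support policy
# described above. Release dates are Ubuntu's published ones; the support
# lengths (18 months regular, 3/5 years LTS desktop/server) are the figures
# quoted in this article, so these are nominal dates, not official EOL notices.
from datetime import date
from calendar import monthrange

def add_months(d, months):
    """Return the date `months` months after d, clamping to the target month's last day."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    day = min(d.day, monthrange(year, month)[1])
    return date(year, month, day)

releases = {
    "Ubuntu 7.10 (regular)":     (date(2007, 10, 18), 18),
    "Ubuntu 6.06 LTS (desktop)": (date(2006, 6, 1), 36),
    "Ubuntu 6.06 LTS (server)":  (date(2006, 6, 1), 60),
    "Ubuntu 8.04 LTS (desktop)": (date(2008, 4, 24), 36),
    "Ubuntu 8.04 LTS (server)":  (date(2008, 4, 24), 60),
}

for name, (released, months) in releases.items():
    print(f"{name}: released {released}, nominal support ends {add_months(released, months)}")
```

Running it reproduces the April expiry for 7.10 Gutsy mentioned above, and puts nominal support for 8.04 out to April 2011 on the desktop and April 2013 on the server.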

Unfortunately, in spite of the LTS designation, not all of the applications in an LTS release are intended to be used for such a long period of time, nor are their developers willing to support them for that length of time. Take Firefox, for example: the last Ubuntu LTS release, 6.06 Dapper, shipped with Firefox 1.5. Mozilla ended support for Firefox 1.x very quickly after Firefox 2 shipped, and you can’t even get support for 2.x now that 3.x has been out for quite some time. This leaves the Ubuntu developers in charge of supplying security updates for the older versions of Firefox they still support, which, while better than the alternative (no security patches), isn’t necessarily a great solution.

The Ubuntu developers have done a good job of staying on top of the matter (they published a new 1.5 security patch as recently as last month), but it highlights the fact that they do not always have the resources to maintain both a stable platform and the necessary security updates. So while an LTS release is supposed to be supported for 3 years, in reality not every component is going to make it that long.

Digging through the bug lists for Dapper and Hardy, I get the impression that these kinds of cracks only show up in less-used software (particularly software that is not part of the default install, such as VLC), so users who need to stick with the base OS for the entire life of an LTS release, but who don’t mind upgrading a few applications, can go that route and cover all of their bases. Unfortunately this is easier said than done, and we’ll get to why that is when we discuss the package manager.
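
To give a sense of what keeping an eye on those few applications involves, here is a minimal sketch (our illustration, assuming a Debian/Ubuntu system with the standard apt tools and Python 3.7+) that asks the package manager which version of a package is installed versus which version the configured repositories currently offer. A gap between the two, or between the candidate and the current upstream release, is the first hint that an application may need to come from somewhere other than the stock LTS archive.

```python
# Illustrative helper: compare the installed version of a package against the
# candidate version offered by the configured repositories.
# Assumes apt-cache is available (any Debian/Ubuntu system) and Python 3.7+.
import subprocess

def apt_versions(package):
    """Return (installed, candidate) version strings reported by apt-cache policy."""
    out = subprocess.run(["apt-cache", "policy", package],
                         capture_output=True, text=True, check=True).stdout
    installed = candidate = None
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("Installed:"):
            installed = line.split(":", 1)[1].strip()
        elif line.startswith("Candidate:"):
            candidate = line.split(":", 1)[1].strip()
    return installed, candidate

if __name__ == "__main__":
    for pkg in ("firefox", "vlc"):
        installed, candidate = apt_versions(pkg)
        print(f"{pkg}: installed {installed}, repository offers {candidate}")
```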

What this amounts to is that if you’re the kind of person who intends to run a computer and an OS for a very long period of time – say on the scale of XP, which turns 8 this year – Ubuntu likely isn’t a good fit for you.

Comments

  • zerobug - Monday, February 1, 2010 - link

    Regarding benchmarks and Linux-focused hardware roundups, one thing worth considering is that while Microsoft puts substantial resources into OS development to create features that push end users toward the latest and most powerful hardware, Linux developers put their effort into letting end users keep their old hardware and still get the best user experience while running the latest and greatest software.
    So the benchmarks could compare the user experience when running popular software on Microsoft and Linux OSes, on machines of varying power.
    For this, you could pick some popular open source and proprietary applications (or their free equivalents) that run on both Linux and W7, and compare the price, time, and power consumption for retrieving, saving, processing, compiling, encrypting, decrypting, compacting, extracting, encoding, decoding, backup, restore, number of frames, etc., on machines with a range of different CPU and memory capacities.
  • MarcusAsleep - Thursday, December 17, 2009 - link

    Quick Startup: OK, Windows is fast - at first. Well, let's say it is if you install it yourself without all the bloatware that comes standard on store-bought Windows PCs (we bought a Toshiba laptop for work with Vista that took 12 minutes after boot-up to respond to a click on the Start menu - even on the third boot).

    Windows startup is often burdened by auto-updates from Microsoft, anti-virus, Sun Java, Acrobat Reader, etc., that slow down the computer on boot-up to where your original idea of "hey, I just want to start my computer and check my email for a minute before work" can take at least 5 minutes. I can do this reliably on Linux in 1. Yes, if you know a lot about Windows, you can stop all the auto-updates and maintain them yourself, but 99% of Windows users don't have the time or the know-how to do this.

    Trouble-free: For example, I installed Linux (Mepis) on a computer for my wife's parents 7 years ago for email, pictures, games, web, and letter writing, and I haven't had to touch it since. This is typical.

    For Windows, I have often done fresh installs on trojan/virus-infected computers - installed working antivirus and all Windows Updates (a process that takes about 2-4 hours of updates upon updates, plus downloads of the proper drivers from the manufacturers' websites, versus about 1 hour for an Ubuntu install with all updates done, including any extra work for codecs and graphics drivers) - only to come back a couple of months later to a computer that is slow again from users installing adware, infected games, etc.

    Free: Nearly every Windows reinstall I've had to do starts with a computer loaded with MS Office, games, etc., but upon reinstall nobody has the disks for these. There is a lot of "sharing" of computer programs in the Windows world that is not very honest.

    With Linux, you can have the operating system plus pretty much anything else you would need, without having to cheat.

    Adaptable Performance: You can get a well-performing Linux installation (LXDE desktop) on a PIII computer with 256MB of RAM. The only thing that will seem slow to an average mom/pop user would be surfing Flash-loaded web pages, but with Adblock on Firefox it's not too bad. With Vista loaded on this computer, it would be annoyingly slow. You can often continue to use/re-use computer hardware with Linux for years after it would be unsuitable for Windows.

    I think these features are of high value to the average user -- maybe not the average Anandtech computer user -- but the average surf/email/do homework/look at photos/play solitaire/balance my checkbook user.

    Cheers!

    Mark.
  • SwedishPenguin - Wednesday, October 28, 2009 - link

    Using SMB for network performance testing is extremely biased. It's a proprietary Microsoft protocol; of course Microsoft is going to win that one. Use NFS, HTTP, FTP, SSH, or some other open protocol for network performance benchmarking. A lot of NASes do support these, as they are Linux-based.

    Furthermore, using a Windows server with SMB on the argument that most consumer NASes use SMB is pretty ridiculous; these NASes are most likely running Samba, not native SMB (the same Samba implemented in GNU/Linux distributions and Mac OS X), not to mention that most of the NASes I've seen offer at least one of these protocols as an alternative.
  • SwedishPenguin - Wednesday, October 28, 2009 - link

    The ISO thing is pretty ridiculous, creating a simple GUI in both GTK and Qt and integrating them into Gnome and KDE should be pretty damn easy, though I suppose integration with the respective virtual file systems would be in order, in which case it might get slightly more complex for those (like me) not familiar with the code. There's even a FUSE (userspace filesystem) module now, so you wouldn't even need to be root to mount it.

    About the command-line support, IMO that's a good thing. It's a lot easier, both for the person helping and the person needing help, to write/copy-paste a few commands than it is to tell someone to click that button, then that one, then another one, etc. It's also a lot easier for the person needing help to simply paste the result if it didn't help, and it makes it much easier to diagnose the problem than if the user had to describe the output. And you usually get much more useful information from the command-line utilities than you do from GUIs; the GUI simplifies the information so anyone can understand it, but at the price of making debugging a hell of a lot more difficult.
  • nillbug - Wednesday, September 30, 2009 - link

    It must be said that Ubuntu and the major Linux distributions have all had 64-bit OS versions for a long time. The reason is to let users benefit from more than 4GB of memory and from 64-bit CPUs (which nearly all CPUs are today), gaining a better computing experience.

    If this article were a private work by the author to answer for himself whether or not he should move to Linux, people should simply advise him of the above. But for an article intended to be read by thousands, it must be pointed out that its conclusion is misleading.

    In the face of today's reality (and not the author's reality), why did he never mention the 64-bit Ubuntu systems? I guess his final thoughts would then have been much more in favor of Linux.
  • seanlee - Tuesday, September 15, 2009 - link

    I have read all 17 pages of comments… a lot of Linux lovers out there… and they all purposely ignore a few important things that make Windows successful, which in turn make most Linux distributions marketing failures. I have used Linux on my netbook and my PS3, and I absolutely hate it.
    1. User friendly. No, the CLI is not user friendly no matter what you say, no matter what excuse you use, no matter how blind you are. NOT ONE COMPANY dares to make its mainstream products CLI-only, from things as simple as ATMs and iPods to things as complicated as cellphones, cars, and airplanes. That ought to tell you something: the CLI is not intuitive, not intuitive = sucks, so CLI = sucks. You command-line fanboys are more than welcome to program punched cards, except no one uses punched cards and machine language anymore because they are counter-intuitive. Having to use the CLI is a pain for the average user, and having to use the CLI every time to install a new program/driver is a nightmare. A GUI is a big selling point, and a seamless computer-human user experience is what every software company is looking to achieve.
    2. There is NOTHING Linux can do that Windows cannot. On the contrary, there are a lot of things Windows can do that Linux cannot. I’d like to challenge any Linux user to find engineering software alternatives on Linux, like MATLAB, Simulink, Xilinx, OrCAD, LabVIEW, CAD… you cannot. For people who actually use their computer for productive means (not that typing documents isn’t productive, but you can type a document on a typewriter with no CPU required whatsoever), there is nothing, again, I repeat, NOTHING that Linux can offer me.
    3. Security issues. I disagree with the security issues that Windows supposedly has. I can set up a Vista machine, turn it on, lock it in a cage, and it will be as secure as any Linux machine out there. Hell, if I bought a rock, pretended it was a computer, and stared at it all day, it would be the most secure system known to mankind. Linux’s security is largely due to one of two reasons: 1. Not popular, so not enough software to support and to play with. 2. Not popular, not user-friendly. Neither is a good title to have. It is like being free of a tax increase not because you have your business set up to write off all your expenses, but because you don’t make any money and thus don’t have to pay tax.
    4. There is nothing revolutionary about Linux for an average user, other than that it is free. If free is your biggest selling point, you are in serious trouble. Most people, if not all, would rather pay for a quality product than take free stuff, unless the free option is just as good. Obviously Ubuntu is never going to be as good as Windows because they don’t have the money that MS has. So what does Ubuntu have that really makes me want to switch and take a few weeks of classes to learn those commands?

    Be honest, people. If you only had ONE OS to use, most of you would choose Windows.
  • kensolar - Monday, October 26, 2009 - link


    I hope you realize that your hatred is showing so strongly that absolutely no one cares what you say.
    That said, I don't know how to use a CLI and have been successfully using Linux for 3 years. I found the article to be fairly fair, even though the author is so unfamiliar with Linux/Ubuntu. Just as he doesn't use the default apps in Windows, Linux users don't stick to the defaults in Linux either. K3b is far superior to Brasero, and so on. In addition, I don't think he conveyed the extent of the software available in the repositories (with additional repositories easy to add): several hundred apps, 20,000 programs, even security apps, and programs ranging from easy as pie to complicated (for those of us who have a computer that is more than a rock). I personally do audio mixing, video transcoding, and advanced printing... all with graphical interfaces.
    BTW, I turned on a computer for the first time 3 1/2 years ago; I stopped using Windows a little over 3 years ago and have no reason to go back. I find it too hard, limiting, and frustrating to use. Plus, I can't live without multiple desktops; the author didn't get it yet, but once you get used to them you can't go back.
    Well, I've said enough for now, can't wait for your next article.
