Installation

In terms of difficulty, right up there with making a good GUI is making a good installer. History is riddled with bad OS installers, with pre-Vista Windows being the best-known example. Text mode installers running on severely castrated operating systems reigned for far too long. Microsoft of course improved this with Windows Vista in 2006, but even as late as the end of 2007 they were still releasing new operating systems such as Windows Home Server that used a partial text mode installer.

The reason I bring this up is that good OS installers are still a relatively recent development in the PC space, which is all the more reason I am very impressed with Ubuntu’s installer. It’s everything those installers were not, and more.

Right now Ubuntu is the underdog in a Windows-dominated world, and their installation & distribution strategies are accordingly based on this. It’s undoubtedly a smart choice, because if installing Ubuntu wiped out Windows the way installing Windows wipes out Ubuntu, it would be nigh impossible to get anyone to try it, since “try it out” and “make it so you can’t boot Windows” are mutually exclusive. Ubuntu plays their position very well in a few different ways.

First and foremost, the Ubuntu installation CD is not just an installer, but a live CD: a fully bootable and usable copy of Ubuntu that runs off of the CD and does not require any kind of installation. The limitations of this are obvious, since you can’t install additional software and CD access times are more than an order of magnitude worse than those of a hard drive, but it nevertheless enables you to give Ubuntu a cursory glance to see how it works, without needing to install anything. Live CDs aren’t anything new for Linux as a whole, but it bears mentioning: they’re an excellent strategy for letting people try out the OS.

This also gives Ubuntu a backdoor into Windows users’ computers, because as a complete CD-bootable OS, it can be used to recover trashed Windows installations when the Windows recovery agent can’t get the job done. It can read NTFS drives out of the box, allowing users to back up anything they can read to another drive, such as a USB flash drive. It also has a pretty good graphical partition editor, GParted, for when worst comes to worst and it’s time to start formatting. The Ubuntu live CD is not a complete recovery kit in and of itself (e.g. it can’t clean malware infections, so it’s more of a tool of last resort) but it’s a tool that has a purpose and serves it well.

Better yet, once you decide that you want to try an installed version of Ubuntu but don’t want to take the plunge of messing with partitions, Ubuntu has a solution for that too. Wubi, the Windows-based Ubuntu Installer, allows you to install Ubuntu as a flat file on an existing NTFS partition. Ubuntu can then boot off of the flat file, having never touched a partition or the master boot record (instead inserting an Ubuntu entry into the Windows BCD). This brings all the advantages of moving up from a live CD to an installed copy of Ubuntu, but without the system changes and absolute commitment a full install entails. Wubi installations are also easily removable, which further drives home this point.
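The mechanism Wubi relies on is essentially a loop device: the Ubuntu root filesystem lives in an ordinary file on the NTFS partition and is mounted as though it were a block device. A minimal sketch of that idea, with illustrative file names and a toy size (actually mounting requires root, so that step is shown commented out):

```shell
# Create a flat file on the host filesystem -- this file plays the
# role of Wubi's root.disk, a "disk" that is really just a file.
dd if=/dev/zero of=root.disk bs=1M count=64

# Format the file itself with a Linux filesystem; -F tells mkfs it is
# fine that the target is a plain file rather than a real partition.
mkfs.ext3 -F -q root.disk

# On a real system the file is then loop-mounted like a partition:
#   sudo mount -o loop root.disk /mnt
```

Because the host partition and MBR are untouched, removing such an installation is just deleting the file, which is why Wubi uninstalls so cleanly.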

Now the catch with a Wubi installation is that it’s meant to be a halfway house between a live CD and a full installation, and it’s not necessarily meant for full-time use. Because everything lives in a flat file inside an NTFS partition, there is a performance penalty from going through the NTFS-3G driver rather than accessing the disk directly, along with both external fragmentation of the flat file and internal fragmentation inside it. An unclean shutdown also runs a slight risk of introducing corruption into the flat file or the NTFS file system, something the Wubi documentation makes sure to point out. As such, Wubi is a great way to try out Ubuntu, but a poor way to continue using it.

Finally, once you’ve decided to go the full distance, there’s the complete Ubuntu installation procedure. As we’ve previously mentioned, the Ubuntu CD is a live CD, so installing Ubuntu first entails booting it up – in our experience this is a bit slower than booting a pared-down installation-only environment such as Vista’s Windows PE. It should be noted that although you can use GParted at this point to make space to install Ubuntu, this is something better left to Windows and its own partition-shrinking ability, due to a gotcha: Windows can move files around to make space in cases where GParted can’t.

Once the installation procedure starts, it’s just 6 steps to install the OS: language, time zone, keyboard layout, installation location, and the credentials for the initial account. Notably, the installer bills itself as 7 steps, but I’ve only ever encountered 6; step 6 is always skipped. This puts it somewhere behind Mac OS X (which is composed of picking a partition and installing; credentials are handled later) and well ahead of Windows, since you don’t need a damn key.

The only thing about the Ubuntu installation procedure that ruffles my feathers is that it doesn’t do a very good job of simplifying the installation when you want to install on a new partition but it’s not the only empty partition. This is an artifact of how Linux handles swap – while Windows and Mac OS X create a swapfile on the same partition as the OS, Linux keeps its swap space on a separate partition. There are some good reasons for doing this, such as preventing fragmentation of the swapfile and always being able to place it after the OS (which puts it further out on the disk, for higher transfer rates), but the cost is ease of installation. Ubuntu’s easy installation modes are for when you want to install to a whole drive (and wipe away its contents in the process) or when you want to install in the largest empty chunk of unpartitioned space. Otherwise, you must play with GParted as part of the installation procedure.
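For reference, what the installer does with that dedicated partition boils down to two commands, and the same tools also support the Windows-style approach of a swap file on the OS partition. A sketch using a file (names and the toy size are illustrative; activation requires root, so those steps are commented out):

```shell
# Reserve space. A dedicated partition avoids the fragmentation issue
# discussed above; a file like this one is the Windows/Mac-style setup.
dd if=/dev/zero of=swapfile bs=1M count=64

# Write the swap signature -- mkswap accepts files as well as partitions.
mkswap swapfile

# Activating it, and making it permanent via /etc/fstab, needs root:
#   sudo swapon swapfile
#   echo '/path/to/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

The `/path/to/swapfile` entry is a placeholder; on a stock Ubuntu install the fstab line instead points at the swap partition the installer created.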

This means the most efficient way to install Ubuntu, if you aren’t installing to an entire disk or don’t already have a single free chunk of space (and the largest one at that), is to play with partitions ahead of time so that the area you wish to install to is the largest free area. It’s a roundabout way to install Ubuntu, and it can be particularly inconvenient if you’re setting up a fresh computer and intend to do more than just dual boot.

Once all of the steps are completed, Ubuntu begins installing, and it’s over in a few minutes. Upon completion Ubuntu installs its bootloader of choice, GRUB, quickly searches for other OS installations (primarily Windows), and adds those entries to the GRUB boot menu. When this is done, the customary reboot occurs, and when the system comes back up you’re faced with the GRUB boot menu – you’re ready to use Ubuntu. Ubuntu doesn’t treat its first boot as anything special, and there are no welcome or registration screens to deal with (I’m looking at you, Apple). It boots up, and you can begin using it immediately. It’s refreshing, to say the least.
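The menu GRUB presents comes from a plain text file, /boot/grub/menu.lst (Ubuntu 8.04 ships GRUB legacy, not GRUB 2). The entries the installer generates look roughly like the following; the device numbers, kernel version, and UUID placeholder here are illustrative, not taken from a real install:

```
title  Ubuntu 8.04, kernel 2.6.24-16-generic
root   (hd0,1)
kernel /boot/vmlinuz-2.6.24-16-generic root=UUID=... ro quiet splash
initrd /boot/initrd.img-2.6.24-16-generic

title  Windows Vista
rootnoverify (hd0,0)
chainloader +1
```

The Windows entry is the product of that post-install OS search: GRUB doesn’t boot Windows itself, it just chainloads the Windows boot sector on the detected partition.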

The actual amount of time required to install Ubuntu is only on the order of a few minutes, thanks in large part to its dainty size. Ubuntu comes on a completely filled CD, weighing in at 700MB, while Windows Vista is on a DVD-5 at over 3GB, and Mac OS X is on a whopping DVD-9 at nearly 8GB. It’s fast to download (not that you can download Windows or Mac OS X) and fast to install.

We’ll get to the applications in depth in a bit, but I’d like to quickly touch on the default installation of Ubuntu. Inside that 700MB are not only the core components of the OS and a web browser, but the complete OpenOffice suite and the Evolution email client too. You can literally install Ubuntu and do most common tasks without ever needing to install anything else beyond security and application updates. Consider the amount of time it takes to install Microsoft Office on a Windows machine or a Mac, and it’s that much more time saved. Canonical is getting the most out of the 700MB a CD can hold.

Comments (195)

  • ParadigmComplex - Wednesday, August 26, 2009 - link

    I concur - while most of the article is quite good, Ryan really seemed to have missed quite a bit here. His analysis of it seemed rather limited if not misleading.

    Not everything *has* to be a package - I have various scripts strewn around, along with Firefox 3.6a1 and a bunch of other things without having them organized properly as .deb's with APT. The packaging system is convenient if you want to use it, but it is not required.

    Additionally, Ryan made it seem as though everything must be installed through Synaptic or Add/Remove and that there were no stand-alone installers along the lines of Windows' .msi files. It's quite easy on Ubuntu to download a .deb file and double-click on it. In fact, it's much simpler than Windows' .msi files - there are no questions or hitting next. You just give it your password and it takes care of everything else.
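To make the .deb comparison concrete: a .deb is just an archive plus control metadata, and the double-click flow wraps the same dpkg machinery available on the command line. This sketch builds and inspects a trivial package; the package name and contents are made up for illustration, and installing the result would need root (shown commented out):

```shell
# Lay out the minimal package tree: control metadata plus one payload file.
mkdir -p hello-demo/DEBIAN hello-demo/usr/local/bin
cat > hello-demo/DEBIAN/control <<'EOF'
Package: hello-demo
Version: 1.0
Architecture: all
Maintainer: Example <user@example.com>
Description: Minimal demonstration package
EOF
printf '#!/bin/sh\necho hello\n' > hello-demo/usr/local/bin/hello-demo
chmod +x hello-demo/usr/local/bin/hello-demo

# Build the archive and read a metadata field back out of it.
dpkg-deb --build hello-demo hello-demo_1.0_all.deb
dpkg-deb --field hello-demo_1.0_all.deb Package

# Installing is one command (the CLI equivalent of the double-click):
#   sudo dpkg -i hello-demo_1.0_all.deb
```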

    The one area where I agree with Ryan is that there needs to be a standardized, easy, GUI fashion to add a repository (both the address and key) to APT. I have no problems with doing things like >>/etc/apt/sources.list, but I could see where others may. I suspect this could be done through a .deb, but I've never seen it done that way.
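To spell out the >> approach being referenced: a repository needs two pieces, an address line and a signing key. This sketch writes the address line to a scratch file; the repository URL is hypothetical, and on a real system the target would be /etc/apt/sources.list (or a file under /etc/apt/sources.list.d/) with the commented-out steps run as root:

```shell
# The one-line format APT expects: type, URL, release, component(s).
# "hardy" is the Ubuntu 8.04 release name.
echo "deb http://ppa.example.com/ubuntu hardy main" >> example.list
cat example.list

# On a real system, also import the archive's signing key so APT will
# trust its packages, then refresh the package index:
#   wget -qO- http://ppa.example.com/key.gpg | sudo apt-key add -
#   sudo apt-get update
```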
  • Ryan Smith - Wednesday, August 26, 2009 - link

    Something I've been fishing for here and have not yet seen much of are requests for benchmarks. Part 2 is going to be 9.04 (no multi-Linux comparisons at this point, maybe later) and I'd like to know what you guys would like to see with respect to performance.

    We'll have a new i7 rig for 9.04, so I'll be taking a look at a few system-level things (e.g. startup time) alongside a look at what's new between 8.04 and 9.04. I'll also be taking a quick look at some compiler stuff and GPU-related items.

    Beyond that the board is open. Are there specific performance areas or applications that you guys would like to see (no laptops, please)? We're open to suggestions, so here's your chance to help us build a testing suite for future Linux articles.
  • cyriene - Monday, August 31, 2009 - link

    I'd like to see differences between PPD in World Community Grid between various Windows and Linux distros.
    I never really see AT talk about WCG or other distributed computing, but I figure if I'm gonna OC the crap out of my cpu, I might as well put it to good use.
  • Eeqmcsq - Thursday, August 27, 2009 - link

    Cross platform testing is pretty difficult, considering there are a multitude of different apps to accomplish the same task, some faster, some slower. And then there's the compiler optimizations for the same cross platform app as you mentioned in the article. However, I understand that from an end user's perspective, it's all about doing a "task". So just to throw a few ideas out there involving cross platform apps so that it's a bit more comparable...

    - Image or video conversion using GIMP or vlc.
    - Spreadsheet calculations using the Open Office Calc app.
    - Performance tests through VMware.
    - How about something java related? Java compiling, a java pi calculator app, or some other java single/multi threaded test app.
    - Perl or python script tests.
    - FTP transfer tests.
    - 802.11 b/g/whatever wireless transfer tests.
    - Hard drive tests, AHCI. (I read bad things about AMD's AHCI drivers, and that Windows AHCI drivers were OK. What about in Ubuntu?)
    - Linux software RAID vs "motherboard RAID", which is usually only available to Windows.
    - Linux fat32/NTFS format time/read/write tests vs Windows
    - Wasn't there some thread scheduling issues with AMD Cool and Quiet and Windows that dropped AMD's performance? What about in Linux?

    While I'm brainstorming, here are a few tests that are more about functionality and features than performance:
    - bluetooth connectivity, ip over bluetooth, etc
    - printing, detecting local/network printers
    - connected accessories, such as ipods, flash drives, or cameras through usb or firewire
    - detecting computers on the local network (Places -> Network)
    - multi channel audio, multi monitor video

    Just for fun:
    - Find a Windows virus/trojan/whatever that deletes files, unleash it in Ubuntu through Wine, see how much damage it does.
  • Veerappan - Thursday, August 27, 2009 - link

    I know you've said in previous comments that using Phoronix Test Suite for benchmarking different OSes (e.g. Ubuntu vs Vista) won't work because PTS doesn't install in Windows, but you could probably use a list of the available tests/suites in PTS as a place to get ideas for commonly available programs in Windows/OSX/Linux.

    I'm pretty sure that Unigine's Tropics/Sanctuary demos/benchmarks are available in Windows, so those could bench OpenGL/graphics.

    Maybe either UT2004 or some version of Quake or Doom 3 would work as gaming benchmarks. It's all going to be OpenGL stuff, but it's better than nothing. You could also do WoW in Wine, or Eve under Wine to test some game compatibility/performance.

    Once you get VDPAU working, I'd love to see CPU usage comparisons between windows/linux for media playback of H.264 videos. And also, I guess, a test without VDPAU/VAAPI working. Too bad for ATI that XvBA isn't supported yet... might be worth mentioning that in the article.

    You also might want to search around for any available OpenCL demos which exist. Nvidia's newest Linux driver supports OpenCL, so that might give you a common platform/API for programs to test.

    I've often felt that DVD Shrink runs faster in Wine than in Windows, so the time to run a DVD rip would be nice, but might have legal implications.

    Some sort of multitasking benchmark would be nice, but I'm not sure how you'd do it. Yeah, I can see a way of writing a shell script to automatically launch multiple benchmarks simultaneously (and time them all), but the windows side is a little tougher to me (some sort of batch script might work). Web Browsing + File Copy + Transcoding a video (or similar).
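The shell-script side of that idea fits in a few lines: launch the workloads in the background, wait for all of them, and time the batch. The workloads below are trivial stand-ins for illustration, not real benchmarks:

```shell
# Record the wall-clock start time in seconds.
start=$(date +%s)

# Launch each workload as a background job.
( dd if=/dev/zero of=/dev/null bs=1M count=256 2>/dev/null ) &   # I/O-ish stand-in
( i=0; while [ "$i" -lt 100000 ]; do i=$((i+1)); done ) &        # CPU stand-in

wait    # block until every background job has exited
end=$(date +%s)
echo "elapsed: $((end - start))s"
```

Per-task times could be collected the same way by wrapping each job in its own `time`, which is roughly what a Windows batch-file equivalent would struggle to do cleanly.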

    Ooh... Encryption performance benchmarks might be nice. Either a test of how many PGP/GPG signs per second, or copying data between a normal disk partition, and a TrueCrypt partition. The TrueCrypt file copy test would be interesting to me, and would cover both encryption performance and some disk I/O.

    One last suggestion: Folding@Home benchmarks. F@H is available at least in CPU-driven form in Windows/Linux, and there's F@H benchmark methodologies already developed by other sites (e.g. techreport.com's CPU articles).

    Well, that's enough for now. Take or leave the suggestions as you see fit.
  • haplo602 - Thursday, August 27, 2009 - link

    you are out of luck here ... linux does not compare to windows because they are different architectures. you already did what you could in the article.

    especially in a binary distribution like Ubuntu, compilation speed tests are meaningless (but Gentoo folks would kiss your feet for that).

    boot up times are also not useful. the init scripts and even init mechanisms are different from distro to distro.

    compression/filesystem benchmarks are halfway usable. on windows you only have NTFS these days. on linux there are like 20 different filesystems that you can use (ext3/4, reiser, jfs and xfs are the most used). also quite many distros offer lvm/evms backends or software raid.

    I do not think there is much benchmarking you can do that will help in linux vs windows, even ubuntu vs windows, because the same benchmarks will differ between ubuntu versions.

    the only usable types are wine+game vs windows+game, native linux game vs the same windows game (mostly limited to unreal and quake engines), some povray/blender tests and application comparisons (like you did with the firefox javascript speed).
  • GeorgeH - Wednesday, August 26, 2009 - link

    Not really a benchmark per se, but I'd be curious to know how the stereotypes of Windows being bloated and Ubuntu being slim and efficient translate to power consumption. Load and idle would be nice, but if at all possible I’d be much more curious to see a comparison of task energy draw, i.e. not so much how long it takes them to finish various tasks, but how much energy they need to finish them.

    I know that'd be a very difficult test to perform for what will probably be a boring and indeterminate result, but you asked. :)
  • ioannis - Wednesday, August 26, 2009 - link

    is there some kind of cross platform test that can be done to test memory usage? Maybe Firefox on both platforms? not sure.

    By "no laptops", I presume you mean, no battery tests (therefore power and as a consequence, heat). That would have been nice though. Maybe for those looking for a 'quiet' setup.

    but yes, definitely GPU (including video acceleration) and the GCC vs Visual Studio vs Intel compiler arena (along with some technical explanation why there are such huge differences)

  • ParadigmComplex - Wednesday, August 26, 2009 - link

    If you can, find games that are reported to work well under WINE and benchmark those against the same games running natively in Windows. It'd be interesting to see how the various differences between the two systems, and WINE itself, affect benchmarks.
  • Fox5 - Wednesday, August 26, 2009 - link

    Number 1 use of Ubuntu is probably going to be for netbooks/low end desktops for people who just wanna browse the net.
    In that case, the browsing experience (including flash) should be investigated.
    Boot up time is important.
    Performance with differing memory amounts would be nice to see (say 256MB, 512MB, 1GB, and 2GB or higher). Scaling across cpus would be nice.

    Ubuntu as a programming environment versus windows would be good to see, including available IDEs and compiler and runtime performance.

    Ubuntu as a media server/HTPC would be good to see. Personally, I have my windows box using DAAP shares since Ubuntu interfaces with it much better than Samba. And as an HTPC, XBMC and Boxee are nice, cross-platform apps.

    Finally, how Ubuntu compares for more specific applications. For instance, scientific computing, audio editing, video editing, and image manipulation. Can it (with the addition of apps found through its Add/Remove Programs app) function well enough in a variety of areas to be an all-around replacement for OSX or Windows?
    Speedwise, how do GIMP and Photoshop compare doing similar tasks? Is there anything even on par with Windows Movie Maker?
    What's Linux like game-wise? Do Flash games take a noticeable performance hit? Are the smattering of id software games and quake/super mario bros/tetris/etc clones any good? How does it handle some of the more popular WINE games?
