Things That Went Right

On the flip side of the things that went wrong, we have the things that went right. Most of the Ubuntu experience went right and has been covered previously, so this is going to be a catch-all for the other things about Ubuntu that impressed me but don’t necessarily fit anywhere else.

One of the nicer features of Mac OS X that you don’t see mentioned very often is the Keychain, a credential management framework that applications can use to securely store passwords and the like. Such systems aren’t rare – even Windows has something similar in its Credentials Manager – but Mac OS X stands out in that its implementation actually gets used, at least some of the time.

I had not been expecting something similar in Ubuntu, so it caught my eye when a Mac OS-like password box popped up as I was logging in to my file server. As it turns out, Ubuntu has similar functionality through the Passwords and Encryption Keys application. And since Ubuntu is built around the GNOME desktop environment that this application is part of, a number of GNOME applications are built against the keyring and make use of it.

It’s not quite as tightly woven into the system as Keychain is under Mac OS X, but it’s better utilized than Windows’ equivalent, and used enough that it makes sense to visit the keyring application. The biggest holdout with a stock install is Firefox, which uses its own password manager regardless of what platform it’s on.
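For the curious, here’s a minimal sketch of what using the keyring looks like programmatically. It relies on the third-party Python keyring package, which stores secrets in the desktop’s native keyring (the GNOME keyring on Ubuntu); the service name, username, and password are made up for illustration.

```python
import keyring

# Store a credential in the desktop keyring; on Ubuntu this lands in the
# GNOME keyring. (All names here are examples, not real accounts.)
keyring.set_password("fileserver.example.com", "ryan", "s3cret")

# Any application going through the same keyring can retrieve it later.
password = keyring.get_password("fileserver.example.com", "ryan")
print(password)  # -> "s3cret"
```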

Another thing that caught my eye was Ubuntu’s archive manager, called File Roller here. As we’ve lamented many, many times before, Windows’ archive management abilities are terrible: files are slow to compress, files are slow to decompress, and supporting only Zip files isn’t quite enough. Mac OS X does a bit better by being faster, but it has absolutely no support for browsing Zip archives; it just packs and unpacks them. Most power users I know will have something like WinRAR or BetterZip installed to get a proper archive browser and wider format support.

File Roller is a complete archive manager, and it supports slightly more exotic archive formats like RAR along with the customary Zip and the *nix-standard GZip. The biggest knock against it when it comes to archive formats is that it can read more formats than it can write, RAR again being the example here.

This also brings up an interesting quirk with archives under *nix that you don’t see under Windows. The Zip format is both a container for multiple files and a compressor for those files. GZip, on the other hand, can only compress a single file – so when it comes time to compress multiple files, they must first be packed into an uncompressed tarball (TAR), and then the tarball is compressed, resulting in .tar.gz. The quirk is that the Zip format compresses each file separately, while .tar.gz by its very nature compresses all the files together as one stream; this is commonly known as solid archiving.
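To make the distinction concrete, here’s a small Python sketch building both kinds of archive with the standard zipfile and tarfile modules; the file names are placeholders. The Zip archive deflates each member on its own, while the “w:gz” mode tars everything first and gzips the result as a single stream.

```python
import tarfile
import zipfile

files = ["report1.txt", "report2.txt", "report3.txt"]  # placeholder names

# Zip: each file is compressed (deflated) separately as it is added.
with zipfile.ZipFile("reports.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in files:
        zf.write(name)

# tar.gz: the files are packed into one tarball, and the whole tarball is
# gzipped as one continuous stream - the "solid" approach described above.
with tarfile.open("reports.tar.gz", "w:gz") as tf:
    for name in files:
        tf.add(name)
```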

Depending on the files being compressed, solid archives can have significant space advantages over individually compressed files by taking advantage of redundancy between the files themselves, and not just the redundancy within individual files. This is also why WinRAR is so common on Windows machines, since the RAR format supports both solid and individual archiving.
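A rough way to see this advantage for yourself is the sketch below: it creates three identical files of random bytes (incompressible on their own, so any savings must come from redundancy between the files) and packs them both ways. The exact numbers will vary, but the .tar.gz should come out at roughly the size of one file, while the Zip stores all three in full. One caveat: gzip only searches a 32 KB window for matches, so this cross-file benefit shrinks as the files get larger.

```python
import os
import tarfile
import zipfile

# Three identical 10 KB files of random bytes. Random data is incompressible
# by itself, so any savings must come from redundancy *between* the files.
blob = os.urandom(10 * 1024)
names = []
for i in range(3):
    name = f"blob{i}.bin"
    with open(name, "wb") as f:
        f.write(blob)
    names.append(name)

# Individually compressed: roughly 3x the size of one file.
with zipfile.ZipFile("blobs.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for n in names:
        zf.write(n)

# Solid: the second and third copies match data earlier in the stream
# (they sit within gzip's 32 KB window), so they nearly vanish.
with tarfile.open("blobs.tar.gz", "w:gz") as tf:
    for n in names:
        tf.add(n)

print("zip:   ", os.path.getsize("blobs.zip"), "bytes")     # ~30 KB
print("tar.gz:", os.path.getsize("blobs.tar.gz"), "bytes")  # ~10 KB
```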

Now the downside to solid archiving is that it takes longer to pull a single file out of a solid archive than out of an individually compressed archive, since everything ahead of that file in the stream must be decompressed first in order to reconstruct the data needed to recreate it. So solid archiving isn’t necessarily the best way to go in every situation.
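Continuing the sketch above, pulling one member out of each archive shows the trade-off: a Zip reader can use the archive’s central directory to seek straight to the member, while the .tar.gz has to be decompressed from the beginning until the member turns up.

```python
import tarfile
import zipfile

# Zip: the central directory lets us jump directly to the member we want.
with zipfile.ZipFile("blobs.zip") as zf:
    data = zf.read("blob2.bin")

# tar.gz: there is no index; gzip decompresses the stream from the start
# until the requested member is reached.
with tarfile.open("blobs.tar.gz", "r:gz") as tf:
    data = tf.extractfile("blob2.bin").read()
```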

Ultimately, with its wider support for archive formats, Ubuntu can in some situations achieve much better compression ratios than what can be done under Windows. Windows isn’t entirely helpless – installers can be packaged as MSI files, which use solid compression – but as far as plain archives are concerned, the only built-in option is individual compression. It’s a small benefit that can pay off nicely from time to time for Ubuntu.

Comments

  • jasperjones - Wednesday, August 26, 2009 - link

    I second most of Fox5's suggestion.

    1.) I've been completely ignorant of software development on Windows over the last few years. Comparison of MS Visual Studio vs Eclipse or vs Netbeans/Sun Studio? How fast are CLI C++ apps on Windows vs. Linux? Perhaps using both GNU and Intel C++ Compiler toolchains on Linux. And possibly MS Visual C++ and Intel Visual C++ on Windows.

    Perhaps less esoteric, 2.) instead of benching SMB/CIFS on Windows vs Samba on *nix, bench something *nix native such as scp/sftp or nfs. Netperf.

    3.) Number-crunching stuff. I guess this is sort of similar to running at least a few synthetic benches. LINPACK or some other test that uses BLAS or LAPACK, tests that use FFTW. Maybe even SPEC (I wouldn't expect any exciting results here, though, or are there?)
  • Eeqmcsq - Wednesday, August 26, 2009 - link

    Are you looking for benchmarks in Windows vs Ubuntu with the same hardware? Or benchmarks in different CPUs/motherboards/etc with the same Ubuntu?
  • Ryan Smith - Wednesday, August 26, 2009 - link

    Cross-platform. There's no problem coming up with Linux-only benchmarks for hardware.
  • Eeqmcsq - Wednesday, August 26, 2009 - link

    I have a question about your benchmarks that involve files, such as copying and zipping. When you run your benchmarks, do you run them multiple times and then get an average? I ask that because I have learned that in Linux, files get cached into memory, so subsequent runs will appear faster. I suspect the same thing happens in Windows. Do you take that into account by clearing cached memory before each run?
  • Ryan Smith - Wednesday, August 26, 2009 - link

    We reboot between runs to avoid cache issues (and in the case of Windows, wait for it to finish filling the SuperFetch cache).
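    (For reference, a hedged sketch of the cache-clearing alternative Eeqmcsq alludes to: on Linux the page cache can be dropped between runs without a reboot by writing to /proc/sys/vm/drop_caches, which requires root.)

    ```python
    import subprocess

    # Flush dirty pages to disk first so the drop is complete.
    subprocess.run(["sync"], check=True)

    # Writing "3" drops the page cache plus dentries and inodes.
    with open("/proc/sys/vm/drop_caches", "w") as f:
        f.write("3\n")
    ```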
  • fri2219 - Wednesday, August 26, 2009 - link

    I heard Sony is coming out with this thing they call a Walkman.

    You should review that next!
  • StuckMojo - Wednesday, August 26, 2009 - link

    ROFL!
  • Fox5 - Wednesday, August 26, 2009 - link

The LTS is really for the same types of people that avoid grabbing the latest MS service pack, i.e. anyone who's still running Windows XP SP2 with IE6. Do that comparison and see how they compare.

Ubuntu is little more than a tight integration of many well-tested packages; there's no reason to go with Ubuntu's LTS when everything else already goes through its own extensive testing. Given how quickly open source software advances, I'd say the LTS is probably less stable than the most up to date versions, and certainly far behind on usability.

    You want the equivalent of Ubuntu's LTS in Windows? It most closely matches the progression that the Windows server versions follow.
  • Ryan Smith - Wednesday, August 26, 2009 - link

    To put things in perspective, 8.04 was released shortly after Vista SP1 and XP SP3 were. So Hardy vs. XP SP2 (a 4 year old SP) is a pretty poor comparison.

    You'll see an up to date comparison in part 2 when we look at 9.04.
  • awaken688 - Wednesday, August 26, 2009 - link

I'm glad you did this article. It really has been something I think about. I'm ready to read your Part II. As others have mentioned, I have a couple of other article ideas that would be great.

    1) The comparison of the various versions as mentioned. SuSe, Ubuntu 9.04, BSD, etc...

    2) Someone mentioned VirtualBox. I'd love to hear more about this including a detailed setup for the normal user. I'd love to be able to surf while in Linux, but able to play games in Windows and keep them separate for added security.

    Thanks for the article! Hope to see one or both of the ideas mentioned above covered.
