Things That Went Terribly, Terribly Wrong

One concern I’ve had for some time while writing this article is that it runs the risk of coming off as too negative. I don’t want to knock Ubuntu just for being different, but at the same time I’m not going to temper my expectations much as far as usability, stability, and security are concerned. If something went wrong, then I intend to mention it, as these are things that can hopefully be resolved in a future version of Ubuntu.

This section is reserved for those things that went terribly, terribly wrong. Things so wrong that it made me give up on using Ubuntu for the rest of the day and go back to Windows. This isn’t intended to be a list of all the problems (or even just the big problems) I encountered using Ubuntu, but rather the most severe.

We’ll start with mounting file servers. I have a Windows Home Server box that I use to store my common files, along with hosting backups of my Macs and PCs. I needed to be able to access the SMB shares on that server, which immediately puts Linux at a bit of a disadvantage since it’s yet another non-native Microsoft protocol that Linux has to deal with, with protocol details that were largely reverse engineered. My Macs have no issue with this, so I was not expecting any real problems here, other than that the network throughput would likely be lower than from Windows.

For whatever reason, Ubuntu cannot see the shares on my WHS box, which is not a big deal since neither can my Macs. What went wrong, however, is that manually mounting these shares is far harder than it needs to be. Again using the Mac as a comparison, mounting shares is as easy as telling Finder to connect to an SMB server and supplying credentials, at which point it gives you a list of shares to mount.

Ubuntu, as it turns out, is not capable of mounting a share based on just the server name and credentials. It requires the share name along with the above information, at which point it will mount that share. Browsing shares based on just a user name and password is right out. Worse yet, if you don’t know this and attempt to do it Mac-style, you’ll get one of the most cryptic error messages I have ever seen: “Can't display location "smb://<removed>/", No application is registered as handling this file.” This tells you nothing about what the problem actually is. It’s poor design from a usability standpoint, and even worse error handling.
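
For reference, mounting a single known share from the terminal looks roughly like this. This is only a sketch – it assumes the CIFS mounting tools (the smbfs package on Hardy) are already installed, and the server, share, user, and mount point names are placeholders for your own:

    # Create a mount point and mount one share, given the server name,
    # share name, and credentials (all placeholders here)
    sudo mkdir -p /mnt/music
    sudo mount -t cifs //homeserver/music /mnt/music -o username=ryan,uid=$(id -u)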

Unfortunately the story doesn’t end here. Ideally all applications would work as well with files on a network share as they do with files on a local drive, but that’s not always the case – often the problem is that it’s harder to browse to a file on a network share than to a local file from inside an application. For this reason I have all of my common shares mapped as drives on Windows (this also saves the effort of logging in), and Mac OS X takes this even further by immediately mapping all mounted shares as drives. So I wanted to do the same for Ubuntu, and have my common shares automount as drives.

Nautilus, which transparently accesses SMB shares, is of no help here, because by transparently accessing SMB shares it doesn’t mount them in a standard way. The mount point it uses is inside of a hidden directory (.gvfs) that some applications will ignore. The ramification of this is that most non-GTK applications cannot see shares mounted by Nautilus: they aren’t told about the mounted share the way GTK applications are, and they won’t look inside the hidden mount point. The chief concern in my case was anything running under Wine, along with VLC.
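
To illustrate, here is where those Nautilus mounts actually end up from a terminal’s point of view; the share and server names are placeholders:

    # Nautilus (via gvfs) puts the share under a hidden directory in your home
    # directory rather than at a conventional mount point
    ls ~/.gvfs
    # -> "music on homeserver"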

The solution is not for the faint of heart. Highlights include additional software installations, manually backing up files, and a boatload of archaic terminal commands – and that’s just if everything goes right the first time. I love the terminal, but this is ridiculous. Once it’s finished and set up correctly it gets the job done, but it’s an unreasonable amount of effort for something that can be accomplished in a matter of seconds on Windows or Mac OS X. This was easily the lowest point I reached while using Ubuntu.
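
For those curious, the usual endpoint of all that effort is an /etc/fstab entry along these lines. Again, this is only a sketch – the package name is what Hardy uses as far as I can tell, and the paths, credentials, and uid are placeholders:

    # Install CIFS mounting support (on Hardy the smbfs package provides mount.cifs)
    sudo apt-get install smbfs

    # Keep the share's credentials out of fstab, readable only by root
    sudo sh -c 'printf "username=ryan\npassword=secret\n" > /etc/cifs-credentials'
    sudo chmod 600 /etc/cifs-credentials

    # An /etc/fstab line along these lines then mounts the share at every boot:
    # //homeserver/music  /mnt/music  cifs  credentials=/etc/cifs-credentials,uid=1000  0  0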

The other thing I am going to throw in this category is mounting ISO images. I keep ISOs of all of my software for easy access. Interestingly enough, Ubuntu has the file system driver necessary to mount ISOs, but no GUI application out of the box to do this. While it would be nice to have all of that built in (à la Mac OS X), that’s not the flaw here – I’m perfectly content downloading a utility, as I do on Windows (Daemon Tools). The flaw here was that the Ubuntu GUI application for this, Gmount-ISO, can’t mount ISOs off of an SMB share. Worse yet, it doesn’t tell you this either.

The first time around, the only solution I was able to find was another archaic CLI command that involved running mount by hand, in the style of “mount file.iso /cdrom -t iso9660 -o loop”. This was a terrible solution.
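
Spelled out in full, that manual process looks something like the following; the image name and mount point are placeholders:

    # Loop-mount an ISO image read-only at an arbitrary mount point
    sudo mkdir -p /mnt/iso
    sudo mount -t iso9660 -o loop,ro file.iso /mnt/iso

    # ...and unmount it when finished
    sudo umount /mnt/iso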

It wasn’t until some time later that I finally found a better solution. An application that wasn’t in the Ubuntu repository, AcetoneISO, can properly mount files off of SMB shares. Better yet, it’s a bit closer to Daemon Tools in functionality, since it can mount BIN/CUE, NRG (Nero Image), and MDF images.

I throw this in the “terribly, terribly wrong” column because the solution was completely non-obvious. If you search for “Ubuntu Hardy mount iso” or something similar, AcetoneISO is nowhere near the top of the results, and the Ubuntu package repository is of no help. What’s in the repository is the aforementioned useless Gmount-ISO, and what’s at the top of Google’s results are Gmount-ISO and instructions to mount the image via the CLI. It’s a success story in the end, but it was uncomfortably painful getting there.

If there’s any consolation in these matters, it’s that these were the only two issues that made me outright stop using Ubuntu and go back to Windows for the day. Any other problems I had were significantly less severe than these.

195 Comments

  • ParadigmComplex - Wednesday, August 26, 2009

    I concur - while most of the article is quite good, Ryan really seemed to have missed quite a bit here. His analysis of it seemed rather limited if not misleading.

    Not everything *has* to be a package - I have various scripts strewn around, along with Firefox 3.6a1 and a bunch of other things without having them organized properly as .deb's with APT. The packaging system is convenient if you want to use it, but it is not required.

    Additionally, Ryan made it seem as though everything must be installed through Synaptic or Add/Remove and that there were no stand-alone installers along the lines of Windows' .msi files. It's quite easy on Ubuntu to download a .deb file and double-click on it. In fact, it's much simpler than Windows' .msi files - there are no questions or hitting next. You just give it your password and it takes care of everything else.
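
    (For reference, the terminal equivalent is just as short - the package file name below is a placeholder:)

        # Install a downloaded .deb, then pull in any dependencies it declares
        sudo dpkg -i some-package_1.0_i386.deb
        sudo apt-get install -f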

    The one area I agree with Ryan is that there needs to be a standardized, easy, GUI fashion to add a repository (both the address and key) to APT. I have no problems with doing things like >>/etc/apt/sources.list, but I could see where others may. I suspect this could be done through a .deb, but I've never seen it done that way.
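
    (The by-hand version I'm alluding to is roughly this - the repository line and key URL are just placeholders:)

        # Append the repository, import its signing key by hand, then refresh
        echo "deb http://example.com/ubuntu hardy main" | sudo tee -a /etc/apt/sources.list
        wget -q -O - http://example.com/repository.gpg | sudo apt-key add -
        sudo apt-get update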
  • Ryan Smith - Wednesday, August 26, 2009

    Something I've been fishing for here and have not yet seen much of is requests for benchmarks. Part 2 is going to be 9.04 (no multi-Linux comparisons at this point, maybe later) and I'd like to know what you guys would like to see with respect to performance.

    We'll have a new i7 rig for 9.04, so I'll be taking a look at a few system level things (e.g. startup time) alongside a look at what's new between 8.04 and 9.04. I'll also be taking a quick look at some compiler stuff and GPU-related items.

    Beyond that the board is open. Are there specific performance areas or applications that you guys would like to see (no laptops, please)? We're open to suggestions, so here's your chance to help us build a testing suite for future Linux articles.
  • cyriene - Monday, August 31, 2009

    I'd like to see differences in PPD in World Community Grid between various Windows and Linux distros.
    I never really see AT talk about WCG or other distributed computing, but I figure if I'm gonna OC the crap out of my cpu, I might as well put it to good use.
  • Eeqmcsq - Thursday, August 27, 2009

    Cross platform testing is pretty difficult, considering there are a multitude of different apps to accomplish the same task, some faster, some slower. And then there's the compiler optimizations for the same cross platform app as you mentioned in the article. However, I understand that from an end user's perspective, it's all about doing a "task". So just to throw a few ideas out there involving cross platform apps so that it's a bit more comparable...

    - Image or video conversion using GIMP or vlc.
    - Spreadsheet calculations using the Open Office Calc app.
    - Performance tests through VMware.
    - How about something java related? Java compiling, a java pi calculator app, or some other java single/multi threaded test app.
    - Perl or python script tests.
    - FTP transfer tests.
    - 802.11 b/g/whatever wireless transfer tests.
    - Hard drive tests, AHCI. (I read bad things about AMD's AHCI drivers, and that Windows AHCI drivers were OK. What about in Ubuntu?)
    - Linux software RAID vs "motherboard RAID", which is usually only available to Windows.
    - Linux fat32/NTFS format time/read/write tests vs Windows
    - Wasn't there some thread scheduling issues with AMD Cool and Quiet and Windows that dropped AMD's performance? What about in Linux?

    While I'm brainstorming, here are a few tests that are more about functionality and features than performance:
    - bluetooth connectivity, ip over bluetooth, etc
    - printing, detecting local/network printers
    - connected accessories, such as ipods, flash drives, or cameras through usb or firewire
    - detecting computers on the local network (Places -> Network)
    - multi channel audio, multi monitor video

    Just for fun:
    - Find a Windows virus/trojan/whatever that deletes files, unleash it in Ubuntu through Wine, see how much damage it does.
  • Veerappan - Thursday, August 27, 2009

    I know you've said in previous comments that using Phoronix Test Suite for benchmarking different OSes (e.g. Ubuntu vs Vista) won't work because PTS doesn't install in Windows, but you could probably use a list of the available tests/suites in PTS as a place to get ideas for commonly available programs in Windows/OSX/Linux.

    I'm pretty sure that Unigine's Tropics/Sanctuary demos/benchmarks are available in Windows, so those could bench OpenGL/graphics.

    Maybe either UT2004 or some version of Quake or Doom 3 would work as gaming benchmarks. It's all going to be OpenGL stuff, but it's better than nothing. You could also do WoW in Wine, or Eve under Wine to test some game compatibility/performance.

    Once you get VDPAU working, I'd love to see CPU usage comparisons between windows/linux for media playback of H.264 videos. And also, I guess, a test without VDPAU/VAAPI working. Too bad for ATI that XvBA isn't supported yet... might be worth mentioning that in the article.

    You also might want to search around for any available OpenCL demos which exist. Nvidia's newest Linux driver supports OpenCL, so that might give you a common platform/API for programs to test.

    I've often felt that DVD Shrink runs faster in Wine than in Windows, so the time to run a DVD rip would be nice, but might have legal implications.

    Some sort of multitasking benchmark would be nice, but I'm not sure how you'd do it. Yeah, I can see a way of writing a shell script to automatically launch multiple benchmarks simultaneously (and time them all), but the Windows side is a little tougher for me (some sort of batch script might work). Web Browsing + File Copy + Transcoding a video (or similar).
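
    On the Linux side it would only take a couple of lines - something like this, where bench1/bench2/bench3 are placeholders for the actual workloads:

        #!/bin/bash
        # Launch several workloads at once and time the whole batch
        time ( ./bench1 & ./bench2 & ./bench3 & wait )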

    Ooh... Encryption performance benchmarks might be nice. Either a test of how many PGP/GPG signs per second, or copying data between a normal disk partition, and a TrueCrypt partition. The TrueCrypt file copy test would be interesting to me, and would cover both encryption performance and some disk I/O.

    One last suggestion: Folding@Home benchmarks. F@H is available at least in CPU-driven form in Windows/Linux, and there's F@H benchmark methodologies already developed by other sites (e.g. techreport.com's CPU articles).

    Well, that's enough for now. Take or leave the suggestions as you see fit.
  • haplo602 - Thursday, August 27, 2009

    you are out of luck here ... linux does not compare to windows because they are both different architectures. you already did what you could in the article.

    especially in a binary distribution like Ubuntu, compilation speed tests are meaningless (but Gentoo folks would kiss your feet for that).

    boot up times are also not useful. the init scripts and even init mechanisms are different from distro to distro.

    compression/filesystem benchmarks are half way usable. on windows you only have NTFS these days. on linux there are like 20 different filesystems that you can use (ext3/4, reiser, jfs and xfs are the most used). also quite many distros offer lvm/evms backends or software raid.

    I do not think there is much benchmarking you can do that will help in linux vs windows, even ubuntu vs windows, because the same benchmarks will differ between ubuntu versions.

    the only usable types are wine+game vs windows+game, native linux game vs the same windows game (mostly limited to unreal and quake engines), some povray/blender tests and application comparisons (like you did with the firefox javascript speed).
  • GeorgeH - Wednesday, August 26, 2009

    Not really a benchmark per se, but I'd be curious to know how the stereotypes of Windows being bloated and Ubuntu being slim and efficient translate to power consumption. Load and idle would be nice, but if at all possible I’d be much more curious to see a comparison of task energy draw, i.e. not so much how long it takes them to finish various tasks, but how much energy they need to finish them.

    I know that’d be a very difficult test to perform for what will probably be a boring and indeterminate result, but you asked. :)
  • ioannis - Wednesday, August 26, 2009

    is there some kind of cross platform test that can be done to test memory usage? Maybe Firefox on both platforms? not sure.

    By "no laptops", I presume you mean, no battery tests (therefore power and as a consequence, heat). That would have been nice though. Maybe for those looking for a 'quiet' setup.

    but yes, definitely GPU (including video acceleration) and the GCC vs Visual Studio vs Intel compiler arena (along with some technical explanation why there are such huge differences)






  • ParadigmComplex - Wednesday, August 26, 2009

    If you can, find games that are reported to work well under WINE and benchmark those against the same games running natively in Windows. It'd be interesting to see how the various differences between the two systems, and WINE itself, could affect benchmarks.
  • Fox5 - Wednesday, August 26, 2009

    Number 1 use of Ubuntu is probably going to be for netbooks/low end desktops for people who just wanna browse the net.
    In that case, the browsing experience (including flash) should be investigated.
    Boot up time is important.
    Performance with differing memory amounts would be nice to see (say 256MB, 512MB, 1GB, and 2GB or higher). Scaling across cpus would be nice.

    Ubuntu as a programming environment versus windows would be good to see, including available IDEs and compiler and runtime performance.

    Ubuntu as a media server/HTPC would be good to see. Personally, I have my windows box using DAAP shares since Ubuntu interfaces with it much better than Samba. And as an HTPC, XBMC and Boxee are nice, cross-platform apps.

    Finally, how Ubuntu compares for more specific applications. For instance, scientific computing, audio editing, video editing, and image manipulation. Can it (with the addition of apps found through its add/remove programs app) function well enough in a variety of areas to be an all-around replacement for OSX or Windows?
    Speedwise, how do GIMP and Photoshop compare doing similar tasks? Is there anything even on par with Windows Movie Maker?
    What's Linux like game wise? Do flash games take a noticeable performance hit? Are the smattering of id software games and quake/super mario bros/tetris/etc clones any good? How does it handle some of the more popular WINE games?
