Things That Went Terribly, Terribly Wrong

One concern I’ve had while writing this article is that it runs the risk of coming off as too negative. I don’t want to knock Ubuntu just for being different, but at the same time I’m not going to temper my expectations much as far as usability, stability, and security are concerned. If something went wrong, then I intend to mention it, as these are things that can hopefully be resolved in a future version of Ubuntu.

This section is reserved for those things that went terribly, terribly wrong – things so wrong that they made me give up on using Ubuntu for the rest of the day and go back to Windows. This isn’t intended to be a list of all the problems (or even just the big problems) I encountered using Ubuntu, but rather the most severe.

We’ll start with mounting file servers. I have a Windows Home Server box that I use to store my common files, along with hosting backups of my Macs and PCs. I needed to be able to access the SMB shares on that server, which immediately puts Linux at a bit of a disadvantage: SMB is yet another non-native Microsoft protocol that Linux has to deal with, one whose details were largely reverse engineered. My Macs have no issue with this, so I was not expecting any real problems here, other than network throughput likely being lower than from Windows.

For whatever reason, Ubuntu cannot see the shares on my WHS box; that by itself is not a big deal, since neither can my Macs. What went wrong, however, is that manually mounting these shares is far harder than it needs to be. Again using the Mac as a comparison, mounting shares is as easy as telling Finder to connect to an SMB server and supplying credentials, at which point it gives you a list of shares to mount.

Ubuntu, as it turns out, is not capable of mounting a share based on just the server name and credentials. It requires the share name along with the above information, at which point it will mount that share. Browsing shares based on just a user name and password is right out. Worse yet, if you don’t know this and attempt to do it Mac-style, you’ll get one of the most cryptic error messages I have ever seen: “Can't display location "smb://<removed>/", No application is registered as handling this file.” This tells you nothing about what the problem actually is. It’s poor design from a usability standpoint, and even worse error handling.
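Since Ubuntu won’t browse shares from just a server name and credentials, the share names it insists on can at least be listed from a terminal with smbclient (part of the Samba tools); the server and user names below are placeholders:

    # List the shares a server offers (prompts for the password)
    smbclient -L //server -U username

From there, the share name can be plugged into Nautilus’s “Connect to Server” dialog.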

Unfortunately the story doesn’t end here. Ideally all applications would work as well with files on a network share as they do with files on a local drive, but that’s not always the case – often it’s harder to browse to a file on a network share than to a local file from inside an application. For this reason I have all of my common shares mapped as drives on Windows (this also saves the effort of logging in), and Mac OS X takes this even further by immediately mapping all mounted shares as drives. So I wanted to do the same for Ubuntu, and have my common shares automount as drives.

Nautilus transparently accesses SMB shares, but that’s of no help here, because in doing so it doesn’t mount them in a standard way. The mount point it uses is inside of a hidden directory (.gvfs) that some applications will ignore. The ramification is that most non-GTK applications cannot see shares mounted by Nautilus: they don’t receive the share information GVFS passes along to GTK applications, nor can they see the hidden mount point. The chief concern in my case was anything running under Wine, along with VLC.
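For the record, that hidden mount point lives under the user’s home directory, which is easy enough to confirm from a terminal:

    # Shares mounted through Nautilus show up under a hidden GVFS directory
    ls ~/.gvfs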

The solution is not for the faint of heart. Highlights include additional software installations, manually backing up files, and a boatload of archaic terminal commands – and that’s just if everything goes right the first time. I love the terminal, but this is ridiculous. Once it’s finished and set up correctly it gets the job done, but it’s an unreasonable amount of effort for something that can be accomplished in a matter of seconds on Windows or Mac OS X. This was easily the lowest point I reached while using Ubuntu.
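To give a sense of what’s involved, here is a rough sketch of the process; the package name is what Hardy ships, but the server, share, and account names are all placeholders, and the exact steps vary from guide to guide:

    # Install the CIFS mounting tools (packaged as smbfs on Hardy)
    sudo apt-get install smbfs

    # Keep the login in a root-only credentials file
    echo -e "username=myuser\npassword=mypass" | sudo tee /root/.smbcredentials
    sudo chmod 600 /root/.smbcredentials

    # Create a mount point and add the share to /etc/fstab
    sudo mkdir -p /media/shared
    echo "//server/shared /media/shared cifs credentials=/root/.smbcredentials,iocharset=utf8 0 0" | sudo tee -a /etc/fstab

    # Mount everything listed in fstab
    sudo mount -a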

The other thing I am going to throw in this category is mounting ISO images. I keep ISOs of all of my software for easy access. Interestingly enough, Ubuntu has the file system driver necessary to mount ISOs, but no GUI application to do it. While it would be nice to have all of that built in (à la Mac OS X), that’s not the flaw here – I’m perfectly content downloading a utility, like I do for Windows (Daemon Tools). The flaw here is that the Ubuntu GUI application for this, Gmount-ISO, can’t mount ISOs off of an SMB share. Worse yet, it doesn’t tell you this either.

The first time around, the only solution I was able to find was another archaic CLI command that involved running the mount command by hand, in the style of “mount file.iso /cdrom -t iso9660 -o loop”. This was a terrible solution.
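Spelled out in full, with placeholder paths, that incantation amounts to the following:

    # Mount an ISO image read-only through a loop device
    sudo mkdir -p /mnt/iso
    sudo mount -t iso9660 -o loop,ro /path/to/image.iso /mnt/iso

    # Unmount it when done
    sudo umount /mnt/iso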

It wasn’t until some time later that I finally found a better solution. An application that wasn’t in the Ubuntu repository, AcetoneISO, can properly mount image files off of SMB shares. Better yet, it’s a bit closer to Daemon Tools in functionality, since it can also mount BIN/CUE, NRG (Nero Image), and MDF images.

I throw this in the “terribly, terribly wrong” column because the solution was completely non-obvious. If you search for “Ubuntu Hardy mount iso” or something similar, AcetoneISO is nowhere near the top of the results, and the Ubuntu package repository is of no help. What’s in the repository is the aforementioned useless Gmount-ISO, and what’s at the top of Google’s results are Gmount-ISO and instructions to mount the image via the CLI. It’s a success story in the end, but it was uncomfortably painful getting there.

If there’s any consolation in these matters, it’s that these were the only two issues that made me outright stop using Ubuntu and go back to Windows for the day. Any other problems I had were significantly less severe than this.

Comments

  • Kakao - Wednesday, August 26, 2009

    Ryan, nowadays you don't need to dual boot. You can just set up a virtual machine. If you are a gamer, use Windows as the host and set up a Linux distro as a guest. If you have enough memory (4GB is very good), you can have both perfectly usable at the same time. I'm using VirtualBox and it works great.
  • VaultDweller - Wednesday, August 26, 2009

    "Manufacturer: Canon"

    I think you mean Canonical.
  • Ryan Smith - Wednesday, August 26, 2009

    It wasn't in our DB when I wrote the article; it was supposed to be added before it went live. Whoops.

    Thank you.
  • Proteusza - Wednesday, August 26, 2009

    I haven't been able to read the whole thing because I'm currently at work, but so far it seems good. Some people have been saying you should be testing 9.04, and I can see their point, but on the other hand, I agree that since 8.04 is the latest LTS release, it should be pretty stable still.

    Nonetheless, perhaps you could compare a later non-LTS release to a service pack for Windows? I mean, there is some new functionality and some fixes. Granted, new versions of Ubuntu contain a lot more new features than Windows service packs.

    I agree that the 6-month release cycle is too fast. I don't develop for Ubuntu myself, but I imagine a lot of time is wasted on preparing for a release twice a year. I mean, there's a lot of testing, bugfixing and documentation to be done, and I would think that if you only did that once a year, you would have more time for development. Although, I guess the more changes you make in a release the more you should test, so maybe that's invalid.

    I've also never really liked the Linux filesystem and package manager idea. Granted, package managers especially have improved a lot lately, and personally I think we have Ubuntu to thank for that, with its huge focus on usability, which historically Linux hasn't cared at all about.

    I also don't like the over-reliance on the terminal/CLI. I don't like that there are certain things that can only be done with it. It's easier and faster for me to do things with a GUI, because we are visual creatures and a GUI is a much better way of displaying information than just plain text. I think until a lot of the Linux developers get over the idea that the CLI is "the only way to go", the GUI will be underdeveloped. As I said, it's only recently that some Linux developers have actually bothered to try to get the various desktop managers up to scratch.

    The other thing I find interesting about Ubuntu is the nerd rage that some Debian developers exhibit towards it.

    Anyway... when 9.10 comes out, I would love to see your impressions of the difference.
  • R3MF - Wednesday, August 26, 2009

    I thoroughly approve of AT running Linux articles...

    However, I didn't bother to read this one, as anything from Q2 2008 is of zero interest to me now.

    May I suggest a group test, to be published around Xmas, of the following Q4 2009 distro releases:
    Ubuntu 9.10
    openSUSE 11.2
    Fedora 12 (?)
    Mandriva 2010

    That would be awesome AND relevant to your readers.
  • CityZen - Wednesday, August 26, 2009

    I was one of those waiting for this article. I do remember getting excited when it was promised back in ... (can't recall the year, sorry, it's been too long :) ). Anyway, the wait seems to have been worth it. Excellent article.
    A suggestion for part 2: install LinuxMint 7 (in addition to Ubuntu 9.04) and see which of the problems you found in part 1 with Ubuntu 8.04 are solved in LinuxMint "out of the box".
  • captainentropy - Tuesday, September 1, 2009

    I totally agree! To hell with Ubuntu, Mint7 is the best Linux distro by far. Before I settled on Mint I tried Ubuntu, Kubuntu, PCLinuxOS (my previous fave), Mepis, Scientific, openSUSE, Fedora, Slackware, CentOS, Mandriva, and RedHat. None came close to the complete awesomeness, beauty, out-of-the-box completeness, and ease of use of Mint7.

    I'm a scientist and I'm using it for sequence and image analysis, so far.
  • haplo602 - Wednesday, August 26, 2009

    So I got to the page before installation, and I have so many comments I cannot read further :-)

    I have been using Linux on and off as my main desktop system since Red Hat 6.0 (that's kernel 2.2, IIRC), so some 10 years. My job is a Unix admin, so I am obviously biased :-)

    1. Virtual desktops - while this heavily depends on your workflow, it helps to organise non-conflicting windows so they don't occupy the same space. I used to have one for IM/email, one with just a web browser, one with my IDE and work stuff, and one for GIMP and Blender. While this is my preference, it helps to kill the notification hell that is Windows. I hate how Windows steals focus from whatever I am working on just because some unimportant IM event occurred.

    2. Package manager and filesystem - given my background, the Linux FHS is second nature to me. However, you failed to grasp the importance of the package manager here: it effectively hides the FHS from you, so you do not need to clean up manually after an uninstall. The only directories you should ever go into manually are /etc, your home dir, the system mount directory and whatever the log directory is. If you need to access other directories manually, then you are either a system developer, a programmer or too curious :-)

    Also, you can usually one-click install .deb packages and they appear in the package manager as usual; you just have to manage dependencies manually in that case. Repositories are nice, as you need to set them up ONCE and then all your updates/future versions are taken care of.
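    As a rough illustration of that manual-dependencies case (the package name below is a placeholder), installing a standalone .deb and then letting APT fetch whatever it is missing typically looks like this:

        # Install a downloaded package; dpkg alone will not fetch dependencies
        sudo dpkg -i some-package.deb

        # Ask APT to fix/fetch any dependencies dpkg reported as missing
        sudo apt-get -f install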

    3. Missing executable icons - this has a lot more background to it, but it is a mistake to use Nautilus in the default icon mode. You basically cannot live without ownership/permissions displayed on a Unix system; trying to hide this in any way in a GUI is a capital mistake. That's why a Windows Explorer-like file manager is not usable under Linux. Good old MC :-) Anyway, an executable file can be anything from a shell script to a binary file. You just have to have the correct launcher registered in the system and you can open anything. Basically the same as Windows, just not as GUI-friendly.

    4. NVIDIA/ATI drivers - this is a story in itself. Use NVIDIA if you want ease of use. Use ATI if you want to learn about the kernel and X :-) Dig through phoronix.com for more info.

    OK, I will post more comments as I read further :-)
  • haplo602 - Wednesday, August 26, 2009

    So I read the whole article, and I have some more comments :-)

    1. Installation - for me this was never a problem on any Linux distro I was using. My partition scheme does not change much, and it is usually the trickiest part of the whole installation process. Try out the full Gentoo 3-stage installation if you want some fun (OK, it is not available via normal means anymore).

    2. Fonts - as you mentioned with codecs, there are software restrictions and licensing policies governing Linux distributions. MS fonts are licensed under different terms than GPL software; yes, even FONTS have licenses. So they are generally not included in Linux distributions by default.

    What I missed from the article is the amount of customisation you can do with a typical Linux distro. Ubuntu alone has 3 main variants, and you can mix and match them at will. You can even have all 3 installed and switch between the window managers by user preference.

    Since you did not like the package manager anyway, you missed out on the main Linux strength - application variability.

    From a common user perspective, however, the article is quite correct. I would expect more from a seasoned Windows user and AT editor.
  • n0nsense - Wednesday, August 26, 2009

    Ubuntu 8.04 is a 14-month-old creature.
    Two versions have been released after it, and a third should arrive in October.
    In terms of Windows that's a short time, but for Linux it's a lot of time.
    I suggest your next review be done on Ubuntu 9.10 instead of 9.04 (which IMHO is better than 8.04 but still lacks some polish).

    As mentioned before, the advantage of CLI instructions is that they will work on any Desktop Environment (Gnome, KDE, XFCE etc.) as long as they're not related to the DE itself. Moreover, they will work on different versions (older/newer).
    For example, in Vista/7 I couldn't find Network Connections in the GUI.
    But who can stop me from typing "Network Connections" in Explorer's address bar? Sometimes the GUI changes, and even if only a little, most people will fail to follow screenshots. Not to mention that most desktops are so customized (on real geeks' computers) that they look too different. I'm not talking about icons or desktop backgrounds. I'm talking about panels (if any at all), docks, menus, context menus etc. In Linux almost everything can be changed. And old-school geeks who have had their Linux installations for years do these things, so each DE is probably unique. (I have had Gnome and app settings/tweaks for over 7 years. Some of them have probably never changed.) The trick is that even when you reinstall the system, your personal settings may stay with you. (I jumped from Debian to Ubuntu to Gentoo, back to Ubuntu, to Ubuntu x86_64 and finally to Gentoo x86_64.) After all this, I have not lost any user customization/settings. On the system level it's harder, since Debian and Gentoo are very different. All this gives you motivation to change and to tweak, to make it better. Windows users can't really customize, and when they do, it's only valid until they have to reinstall/upgrade their OS. Since most of the Windows users I know reinstall at least once a year, after a few cycles they will stay with the defaults for both OS and applications.

    Switching to Linux is not the easiest thing. It's usually not a "love at first sight" story. But if somehow you stay around and get to know it, you can't be separated afterwards :)
    Even on Windows 7 I feel handicapped in terms of usability and effectiveness/productivity. (I spend more time in front of Windows than Linux computers.)
