Things That Went Terribly, Terribly Wrong

One concern I’ve had throughout the writing of this article is that it runs the risk of coming off as too negative. I don’t want to knock Ubuntu just for being different, but at the same time I’m not going to temper my expectations much as far as usability, stability, and security are concerned. If something went wrong, then I intend to mention it, as these are things that can hopefully be resolved in a future version of Ubuntu.

This section is reserved for those things that went terribly, terribly wrong. Things so wrong that it made me give up on using Ubuntu for the rest of the day and go back to Windows. This isn’t intended to be a list of all the problems (or even just the big problems) I encountered using Ubuntu, but rather the most severe.

We’ll start with mounting file servers. I have a Windows Home Server box that I use to store my common files, along with hosting backups of my Macs and PCs. I needed to be able to access the SMB shares on that server, which immediately puts Linux at a bit of a disadvantage: SMB is yet another non-native Microsoft protocol, and one whose details were largely reverse engineered. My Macs have no issue with this, so I was not expecting any real problems here beyond network throughput likely being lower than it is from Windows.

For whatever reason, Ubuntu cannot see the shares on my WHS box, which is not a big deal since neither can my Macs. What went wrong, however, is that manually mounting these shares is far harder than it needs to be. Again using the Mac as a comparison, mounting shares there is as easy as telling Finder to connect to an SMB server and supplying credentials, at which point it gives you a list of shares to mount.

Ubuntu, as it turns out, is not capable of mounting a share based on just the server name and credentials. It requires the share name along with the above information, at which point it will mount that share. Browsing shares based on just a user name and password is right out. Worse yet, if you don’t know this and attempt to do it Mac-style, you’ll get one of the most cryptic error messages I have ever seen: “Can't display location "smb://<removed>/", No application is registered as handling this file.” This tells you nothing about what the problem actually is. It’s poor design from a usability standpoint, and even worse error handling.
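For reference, something like the following will at least coax the share list out of the server from a terminal; this is only a sketch, with the server, share, and user names here being placeholders, and smbclient living in a separate Samba package that may need to be installed first:

    # Ask the server for its share list, since Ubuntu won't browse it for you
    smbclient -L homeserver -U ryan

    # Then give Nautilus (Places > Connect to Server) the full path, share name included:
    # smb://homeserver/software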

Unfortunately the story doesn’t end here. Ideally all applications would work as well with files on a network share as they do with files on a local drive, but that’s not always the case – often the problem is that it’s harder to browse to a file on a network share than to a local file from inside an application. For this reason I have all of my common shares mapped as drives on Windows (this also saves the effort of logging in each time), and Mac OS X takes this even further by immediately mapping all mounted shares as drives. So I wanted to do the same for Ubuntu, and have my common shares automount as drives.

Nautilus, which transparently accesses SMB shares, is of no help here, because in accessing them transparently it doesn’t mount them in any standard way. The mount point it uses is inside of a hidden directory (.gvfs) that some applications will ignore. The ramification of this is that most non-GTK applications cannot see shares mounted by Nautilus: they don’t get the mounted share that GTK advertises to its own applications, nor can they see the hidden mount point. The chief concerns in my case were anything running under Wine, along with VLC.
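To illustrate the problem, this is approximately what a Nautilus-mounted share looks like from the shell; the share and server names are made up, and the exact folder naming may vary between gvfs versions:

    # Shares mounted through Nautilus end up under a hidden FUSE directory
    # in your home folder rather than under /media or /mnt
    ls ~/.gvfs/
    # software on homeserver

    # GTK/GNOME applications reach the share through gvfs; Wine, VLC, and other
    # non-GTK applications aren't told about the share, and browsing to a hidden
    # dot-directory from their file dialogs is awkward at best.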

The solution is not for the faint of heart. Highlights include additional software installations, manually backing up files, and a boatload of archaic terminal commands – and that’s just if everything goes right the first time. I love the terminal, but this is ridiculous. Once it’s finished and set up correctly it gets the job done, but it’s an unreasonable amount of effort for something that can be accomplished in a matter of seconds on Windows or Mac OS X. This was easily the lowest point I reached while using Ubuntu.
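For the curious, the end result is roughly along these lines – a sketch only, with the server, share, and account names as placeholders, and with CIFS mount support coming from the smbfs package on Hardy, if I recall correctly:

    # Install CIFS mounting support
    sudo apt-get install smbfs

    # Create a permanent mount point and a credentials file that isn't world-readable
    sudo mkdir -p /media/software
    printf "username=ryan\npassword=secret\n" | sudo tee /root/.smbcredentials
    sudo chmod 600 /root/.smbcredentials

    # Add a line like this to /etc/fstab so the share comes up as a normal mount:
    # //homeserver/software  /media/software  cifs  credentials=/root/.smbcredentials,uid=1000,iocharset=utf8  0  0

    # Mount everything in fstab to test it
    sudo mount -a

Once that’s done, the share behaves like any other directory, and Wine and VLC are none the wiser.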

The other thing I am going to throw in this category is mounting ISO images. I keep ISOs of all of my software for easy access. Interestingly enough, Ubuntu has the file system driver necessary to mount ISOs, but no GUI application to do it. While it would be nice to have all of that built in (à la Mac OS X), that’s not the flaw here – I’m perfectly content downloading a utility, as I do for Windows (Daemon Tools). The flaw is that the Ubuntu GUI application for this, Gmount-ISO, can’t mount ISOs off of an SMB share. Worse yet, it doesn’t even tell you this; it simply fails.

The first time around, the only solution I was able to find was another archaic CLI command: running the mount command by hand, in the style of “mount file.iso /cdrom -t iso9660 -o loop”. This was a terrible solution.
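For anyone who ends up going this route anyway, the full incantation looks something like the following; the image path and mount point are placeholders:

    # Create a mount point and loop-mount the image (root privileges required)
    sudo mkdir -p /media/iso
    sudo mount -t iso9660 -o loop,ro /path/to/image.iso /media/iso

    # And to unmount it when finished
    sudo umount /media/iso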

It wasn’t until some time later that I finally found a better solution. An application that isn’t in the Ubuntu repository, AcetoneISO, can properly mount images off of SMB shares. Better yet, it’s a bit closer to Daemon Tools in functionality, since it can also mount BIN/CUE, NRG (Nero Image), and MDF images.

I throw this in the “terribly, terribly wrong” column because the solution was completely non-obvious. If you search for “Ubuntu Hardy mount iso” or something similar, AcetoneISO is nowhere near the top of the results, and the Ubuntu package repository is of no help. What’s in the repository is the aforementioned useless Gmount-ISO, and what’s at the top of Google’s results are Gmount-ISO and instructions for mounting the image via the CLI. It’s a success story in the end, but getting there was uncomfortably painful.

If there’s any consolation in these matters, it’s that these were the only two issues that made me outright stop using Ubuntu and go back to Windows for the day. Any other problems I had were significantly less severe than these.

Comments (195)

  • brennans - Sunday, August 30, 2009 - link

    I use both XP64 and Hardy (Ubuntu 8.04).
    I am also a power user.

    Both these operating systems have pros and cons.

    Cons for XP64:
    1. It does not recognize my hardware properly.
    2. Finding 64 bit drivers was/is a mission.

    Cons for Hardy:
    1. It does not plug and play with my hardware (I have to compile the drivers).
    2. Not as user friendly as windows.

    Pros for XP64:
    1. Windowing system is super fast.
    2. User friendly.

    Pros for Hardy:
    1. Recognizes my hardware.
    2. Command line tools are awesome.

    Conclusion:
    I think that the article was good.

    I am one of those people who has always had problems installing Windows straight out of the box, and thus I find that paying a large amount of money for their buggy OS is unacceptable.

    I can get a lot of stuff done with Hardy, it is free, and if I find a problem with it I can potentially fix that problem.

    I also find it unacceptable that manufacturers do not write software (drivers or application software for their devices) for Linux.

    For me, it is difficult to live without both XP64 and Hardy.



  • ciukacz - Sunday, August 30, 2009 - link

    http://www.iozone.org/
  • JJWV - Sunday, August 30, 2009 - link

    How can people use something like Aero and its Linux or OSX equivalents (which pre-date it, if I am not mistaken)? The noise is just hiding the information. Transparency is one issue; another is those icons that are more like pictures: one loses the instant recognition. With Aero, knowing which is the active window is not something obvious; you have to look at small details. The title of the window is surrounded by mist, making it more difficult to read. Even with XP the colour gradient in a title bar is just noise: there is no information conveyed by it.

    The OS GUIs are looking more and more like those weird media players, with an image of a rotary button that has to be manipulated like a slider.

    The evolution of all applications to a web interface reminds me of the prehistory of personal computers : each program has its own interface.

    The MS Office Ribbon UI is just in the same vein: more than 20 icons on each tab. The icon interface is based on instant recognition and comprehension, when you have so many it turns into a mnemonics exercise. And of course with MS one does not have a choice : you just have to adapt to the program. An end user is only there to be of service to the programs ;-)

    If I want to look at a beautiful image I will do so, but when I want to write a letter or update a database, all those ultra-kitsch visual effects are just annoying.

    In summary, the noise is killing the information, and thus the usability.
  • Ronald Pottol - Saturday, August 29, 2009 - link

    The thing with Windows has been seen before: back in the Win 3.1/OS/2 days it was found that while one instance of Excel didn't run any faster under OS/2, two instances in separate VMs (OK, not technically the same thing) ran in about the same time as one did on Windows.

    I like the package management, and hate when I have to install something that doesn't support it; it means I have to worry about updates all by myself. If they have one, then I get updates every time I check for Ubuntu updates, which is very handy. Nice to get the nightly Google Chrome builds, for instance (still alpha/very beta).

    Frankly, supporting binary kernel drivers would be insane. The kernel developers would be stuck supporting code they cannot look at and cannot fix; they could not fix their own mistakes (or would be stuck emulating them forever). If binary drivers were supported, there would be even more of them, and whenever the developers wanted to fix something broken or ill-conceived, they would have to wait a reasonable amount of time before doing it so that the drivers stayed supported. Frankly, I don't see why people don't have automated frameworks for this, with automated deb/rpm repository generation. I add the vendor's repository; when I get a kernel update, perhaps it is held up a day while their system automatically builds a new version, but then it all installs. Instead, I am stuck either having to run a very old kernel or not having 3D on my laptop, for instance.
  • cesarc - Sunday, August 30, 2009 - link

    I found this article very interesting, because it is oriented toward Windows users and is helpful to them, because you didn't just die trying it.
    But you can't blame Ubuntu (or any distro) for what a pain in the ass a video card's driver can be to install; blame ATI and NVIDIA for being lazy. And if using Wine for playing games is not as good as playing in Windows, blame the game companies for not releasing GNU/Linux versions.
    Also, the reason GNU/Linux surpasses Windows in file management is that NTFS is a BAD file system; maybe if Windows could somehow run on ext3 it would be even better than it is.
    And why the reluctance to use a console (stop saying CLI, please)? You are not opening your mind; you are trying to use GNU/Linux as if it were Windows, but it is not Windows – it is a completely different OS. Look at it from this point of view: something that you can do in Windows with 5 clicks you can maybe do in GNU/Linux with just one line of bash code. So sometimes you will use the GUI and other times you will use the console, and you will find that having both options is very comfortable. So start using the console and write the same article a year later.
    I hope some day to have a paid version of GNU/Linux (still open source) that could pay salaries to programmers to fix specific issues in the OS.
    On the other hand, when you do the IT benchmarks it is very disappointing that you don't use Linux with those beautiful Xeons. The server environment is where GNU/Linux gets stronger. And Xeons with Windows are just toys compared with Unix on SPARC or POWER architectures.

    PS: Try to get 450 days of uptime on Windows 2003.
  • rkerns - Saturday, August 29, 2009 - link

    Ryan,

    Thanks for your good work.

    Many people considering linux are still on dial-up. These are often folks with lesser expertise who just want to get connected and use their computer in basic ways. But getting connected with dial-up is something of an adventure with many distros and/or versions. Ubuntu 9.04 has moved away from easy dial-up, but Mint7KDE includes KPPP for simple dial-up connection. Mint7KDE has other nice features as well.

    I am asking you to expand your current picture of the landscape to include people who want to use linux with a dial-up connection. This of course would have to include a brief discussion of 1) appropriate modems and 2) distro differences. Thanks,
    r kerns
  • William Gaatjes - Saturday, August 29, 2009 - link

    FREE

    pant pant pant pant

    pant pant pant pant
  • lgude - Saturday, August 29, 2009 - link

    Really glad to find this in-depth article after all this time. Thank you Ryan. I too have run Ubuntu as my main OS even though most of my experience is in Windows, and I have had similar experiences. Because this was a very long article it got into detail about things like the Package Manager or the multiple desktops that I have not seen discussed elsewhere from a user perspective. As someone else pointed out, it is moot what people would like or complain about if they were moving from Linux to Windows or OSX, but imagine for a moment if they were used to getting the OS and all their apps updated in one hit and were asked to do it one app at a time and expected to pay for the privilege!

    If you go on with the Linux series I'd like to see discussion of the upcoming Ubuntu and other distros - I've been impressed with SUSE. I'd also like to see projects on how to build a Linux server and HTPC - including choice of distro and the kind of hardware needed. I'm less sure of where benchmarking is really useful - the tradition of detailed benchmarking at AT arose from the interest in overclocking and gaming, which I think is a much lesser consideration in Linux. More relevant might be comparisons of netbook-specific distros or how to work out if that old P4 will do as a home server. There is a lot of buzz in the tech world about things like Symbian, Chrome OS, Moblin, and Maemo on portable devices that could possibly draw new readers to the Linux tab at AT. A great start in any case.
  • jmvaughn - Saturday, August 29, 2009 - link

    I just wanted to say thank you to the author for a very thorough article. After reading it, I decided to use Ubuntu for a PC I'm building out of spare parts for a retired friend who's on a fixed income. My friend just uses the web, e-mail, and some word processing, so this will be perfect.

    The article gave me a good idea of what to expect -- a good honest appraisal with all the good and bad. After installing Ubuntu 9.04, I am very impressed. The install was very quick, and easier than XP. Everything is quite snappy, even though it's running on an AMD 3800+ single-core processor and an old hard drive.

  • xchrissypoox - Saturday, August 29, 2009 - link

    I only skimmed the article (I saw the part on gaming being poor), but I'd like to see a comparison of several games using the same hardware on Windows and Linux (results given in fps). If this has been mentioned already, sorry, and good day.
