Last year at BUILD I got my first chance to try Hololens. That experience was very interesting, not only because of the potential of Augmented Reality, but because of the entire circus surrounding the device. Last year's Hololens sessions were held at a different location, and the groups brought over had to lock up everything electronic; we could only take photos of a unit in a display case. Naturally, with Microsoft announcing yesterday that Hololens would begin shipping to developers, this year's experience could never be as secret.

So when we got to the demo location and were given keys for a locker, I was a bit taken aback. But it wasn't anything as sinister this time; it was just a way to make sure there were no backpacks on the floor as tripping hazards, because this year's untethered experience was truly untethered.

That comes a bit later though. This year's demo involved building and deploying a 3D app using Unity and Visual Studio, and each person doing the demo also got a coach to help solve any issues along the way. The Hololens unit was slightly different this year, but looking at it, it was remarkably similar to last year's demo version. The one big change this year was very welcome: instead of having a person physically measure your inter-pupillary distance (the distance between your pupils), calibration is now handled in software when you first put the headset on. There is a quick calibration routine that sets your eye positions based on some air tap gestures. It was fast and easy, and the headset walks you through everything required with voice and visual cues.

Afterwards we sat down to build our apps. Since this was a demo for the press, all of the coding was done ahead of time, and we just had to walk through adding scripts in Unity to set up the demo. We would then build the apps and deploy them to the Hololens as a remote machine, using the headset's IP address.
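
To give a sense of what those scripts look like, here is a minimal sketch in C# (Unity's scripting language) of the sort of air tap handler a demo like this might wire up. To be clear, this is not the actual demo code: the class name and placement logic are my own illustration, though the GestureRecognizer API is the one Unity exposed for Hololens development at the time.

```csharp
// Hypothetical example, not the demo's code: respond to the Hololens
// "air tap" gesture and move a hologram to where the user is looking.
using UnityEngine;
using UnityEngine.VR.WSA.Input; // 2016-era Unity namespace for Hololens input

public class TapToPlace : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        // Listen for the air tap gesture.
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += OnTapped;
        recognizer.StartCapturingGestures();
    }

    private void OnTapped(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        // Place this hologram where the user's gaze ray hits the
        // spatial mapping mesh (or any other collider).
        RaycastHit hit;
        if (Physics.Raycast(headRay, out hit, 10.0f))
        {
            transform.position = hit.point;
        }
    }

    void OnDestroy()
    {
        recognizer.TappedEvent -= OnTapped;
        recognizer.StopCapturingGestures();
    }
}
```

Once a script like this is attached to an object in Unity, building the project generates a Visual Studio solution, and deploying is a matter of selecting "Remote Machine" as the target and entering the headset's IP address.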

The demo app itself featured an energy ball which, when locked to a location in space, would open up and show some nifty effects. The experience was very basic compared to what I would expect of retail apps, but as a simple demo it worked well.

The very interesting bit came later, when we linked our Hololens units with those of the other people in our six-person pods so that all six people could interact with a single energy ball. Everyone also got to choose an avatar which would float over their head. That experience was pretty amazing: with very little setup, the holograms were truly linked to a single point that everyone could see.

As part of this demo, my coach suggested I walk around the (very large) room and then look back. This was probably the most amazing part of the demo. After walking a hundred feet or more away, around some tables and pillars, I looked back and the hologram was still floating exactly where I left it. The ability to truly lock things to a location is the one part that needs to be perfect for this experience to work, and they nailed it. Meanwhile, my pod mates were spread around the room with their avatars floating over their heads.
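
For the curious, that persistence comes from anchoring holograms to fixed real-world positions. Here is a rough sketch of the pattern using the WorldAnchor component Unity provided for Hololens at the time; the class and method names are my own illustration, not the demo's code.

```csharp
// Hypothetical sketch: pin a hologram to a fixed point in the real world.
using UnityEngine;
using UnityEngine.VR.WSA; // 2016-era Unity namespace for Hololens features

public class HologramLock : MonoBehaviour
{
    // Call once the hologram is positioned; the WorldAnchor keeps it
    // locked to that real-world spot even as the user walks away.
    public void LockInPlace()
    {
        if (GetComponent<WorldAnchor>() == null)
        {
            gameObject.AddComponent<WorldAnchor>();
        }
    }

    // An anchored object can't be moved, so remove the anchor first
    // if the hologram needs to be repositioned.
    public void Unlock()
    {
        WorldAnchor anchor = GetComponent<WorldAnchor>();
        if (anchor != null)
        {
            DestroyImmediate(anchor);
        }
    }
}
```

Presumably the shared six-person experience builds on the same idea, with an anchor exported from one headset and imported by the others so that everyone agrees on a single point in space.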

After a night to think about it, I want to summarize my thoughts on using both the previous and the latest incarnations of the Hololens. The field of view issue is still there, and it is clearly not something they were able to address before shipping to developers. I would describe it as something like a mid-sized television, in the 27-inch range, sitting a few feet away from you. My experience was better this time because there were fewer software issues, but the small field of view can certainly take some getting used to.

The hardware itself was very easy to put on and adjust, and it was fairly well balanced, in that I never felt like the unit was heavier at the front where the lenses are. The adjustment is done with a wheel on the back, much like a welding helmet if you've ever seen one of those. The right side has buttons for volume, and the left side has buttons for brightness. I had to crank up the audio quite a bit because of the loud room we were in, and although the audio was spatial, it was hard to get a sense of that with the commotion going on during the demos. Meanwhile, although I don't wear glasses, it looked like there would be no issues wearing them with the Hololens, and several of the other attendees had no trouble putting the device on over their glasses and using it.

The experience of AR is much different from that of VR. Because you are interacting with things in real space, you can easily move around without fear of tripping or walking into a wall. VR is able to offer much more complex graphics and immersion right now, but you are largely bound to a single location. The use cases for AR are, to me, not necessarily the same as those for VR, and the two should easily be able to co-exist.

While going through my demo session, I asked my coach how to close the app we were running, and he showed me the "bloom" gesture which closes it. Once I did that, I was in another mode where I could see how the Hololens mapped out the physical world with polygons by tapping my finger in a direction. This was amazing, and the Hololens did a great job of picking up everything in my area, including the people, with no issues.
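
Under the hood this is the spatial mapping system at work. As a rough illustration (again not the demo's code, and with hypothetical names), Unity's 2016-era SurfaceObserver API is how an app watches those surface polygons come and go:

```csharp
// Hypothetical sketch: observe the surfaces Hololens reconstructs from
// the room. A real app would also request each surface's mesh and render
// it, e.g. as the wireframe polygons described above.
using System;
using UnityEngine;
using UnityEngine.VR.WSA; // 2016-era Unity namespace for spatial mapping

public class SurfaceWatcher : MonoBehaviour
{
    private SurfaceObserver observer;
    private float lastUpdateTime;

    void Start()
    {
        // Watch a 10 m box of space centered on the user's start position.
        observer = new SurfaceObserver();
        observer.SetVolumeAsAxisAlignedBox(Vector3.zero, Vector3.one * 10.0f);
    }

    void Update()
    {
        // Polling every frame is wasteful; every few seconds is typical.
        if (Time.time - lastUpdateTime > 3.0f)
        {
            lastUpdateTime = Time.time;
            observer.Update(OnSurfaceChanged);
        }
    }

    private void OnSurfaceChanged(SurfaceId id, SurfaceChange change, Bounds bounds, DateTime updateTime)
    {
        // Added/Updated surfaces would have their meshes requested here
        // (via RequestMeshAsync) before being drawn.
        Debug.Log(string.Format("Surface {0}: {1}", id.handle, change));
    }
}
```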

Another bloom then put me back at the start screen. On the demo units this was pretty sparse, but I was able to go into the settings and play around. I didn't see anything special in there, but the process of interacting with the menus was very simple and easy to get used to. From a UI standpoint, the Hololens did very well.

Towards the end of the demo session we did some shooting of orbs, which opened up a hole in the floor. Peering down into it, it really felt like something you didn't want to step into. The holograms tend to be a bit translucent, but this one in particular was much more solid. There's a lot of untapped potential here, and I hope to get a chance to try some of the other demos they have here to get a better feel for it. The headset did seem to be near the edge of its processing power on the final demo though, which had a lot of fairly simple polygons moving around and six people interacting; there were a lot of things to keep track of, as well as quite a few holograms flying around.

Afterwards, Microsoft told us that all of the code we used in the demo, along with all of the code used in last year's demos, is now available on GitHub to give developers quicker access to it.

I think the Hololens is still one of the most interesting pieces of tech I’ve used in a long time. There is a lot of potential here for education, training, and even tasks like painting your house and trying different color samples. There are quite a few applications where this would work very well.

The hardware, though, still needs a bit of work. It is a bit bulky, and the lenses would not stay anchored in the spot in front of my eyes where I set them, so I had to readjust. The field of view is also not very large, and this could be because the onboard processing power can't match that of tethered VR experiences.

I look forward to seeing where this goes in the future. A lot of the pieces are already well done, and on the software side the experience is very good. With somewhat better hardware, which will almost certainly come with time, this is going to be a very powerful tool from Microsoft.

23 Comments

  • Stahn Aileron - Thursday, March 31, 2016

    Okay, I hate to do this because I like the site, but I have to now...

    Is anyone on the editorial staff either:

    A) A native English speaker or
    B) An English major?

    I've noticed over the years (beginning around the time Anand started getting hands-off with the site) that the writing quality has slowly decline. It's as if the articles are written and given a 30-second editor review before being posted.

    Sentences run long and are awkward. In some case, they're actually hard to parse and understand without re-reading and breaking them up. I find myself re-reading phrases multiple times more often as time goes by. The writing isn't structured very well. I sometimes feel like I'm reading someone's speech pattern or un-edited thought process rather than a deliberate attempt at professional journalistic writing.

    I love the site for its content. It's just getting harder to understand some of the said content without some amount of confusion.
  • Magius - Thursday, March 31, 2016

    A few suggestions:
    beginning around the time => around the time
    slowly decline => slowly declined
    In some case => In some cases
    hard to parse and understand => hard to comprehend
    "re-reading phrases multiple times" => reading phrases multiple times OR re-reading phrases
    "I love the site for its content. It's just getting harder to understand some of the said content without some amount of confusion." => "I love the site for its content. It's just getting harder to understand some of the recent articles without some amount of confusion."

    Happens to the best of us. Your post is valid but still you might want to reconsider the initial portion, as it might be offensive to some. I do agree that the article leans towards a stream of thought style. Perhaps it is because of my continuous perusal of engineering write-ups that I do not mind as much. I have seen worse, much worse.
  • Stahn Aileron - Friday, April 1, 2016

    And I stand corrected. This is what happens when I kind of rush through and don't copy-edit my own work properly. (Guess that sort of proves the point ^_^) I've seen much worse as well. (Fairly recently, too, at school.) Ryan Smith was kind enough to reply. I've replied in turn below.

    I do admit the start of my comments came off harsh. It was the product of seeing the issue for a couple of years. Ryan was still willing to address it though, thankfully. I thank both you and Ryan for the feedback.
  • nandnandnand - Thursday, March 31, 2016

    "Microft then said that all of the code that we used in the demo, and all of the code used on the demos last year, is all available on GitHub to allow devs quicker access to code."

    ^ Perpetuating the problem by giving them free editing.
  • Ryan Smith - Thursday, March 31, 2016

    Yes, virtually the entire staff is native English speakers. As for English majors, I find that teaching them tech is harder than glaring at the technical staff until their English improves. ;-)

    Anyhow, while we always strive for the highest quality, much of the time we're working on very short deadlines. This means that there isn't as much time for copy editing as we'd like. We do the best we can, but we have to strike a balance between speed and quality. A poor article is a poor article, and a late article is a late article; neither one is very useful.

    Anyhow, I've gone ahead and reworked this article to something that you should find more enjoyable. And though I don't necessarily have the response you'd like to hear, I appreciate the feedback all the same.
  • Sunrise089 - Friday, April 1, 2016

    Ryan,

    First, thank you, that was an above-and-beyond reply to the OP.

    I always worry though about remarks like 'we can be either sloppy or late and both are bad' since they seem to represent a change in Anand's 'don't be cable news' motto. I feel like slightly late is absolutely better than sloppy, and furthermore that it's not a binary proposition, since you could also be quick+thorough on fewer topics for example.

    @OP - One element of this is the site let the author tasked with editing other articles depart a while back. Presumably the thought was that it made sense to invest those resources instead in more content at potentially lower quality.
  • Ryan Smith - Friday, April 1, 2016

    "I always worry though about remarks like 'we can be either sloppy or late and both are bad' since they seem to represent a change in Anand's 'don't be cable news' motto."

    This is something we've always had to balance. In situations where we're crunched for time, when we do it right our articles are just clean enough to pass muster, and just soon enough not to be entirely too late. Otherwise if we have enough time, quality is always goal #1.
  • Stahn Aileron - Friday, April 1, 2016

    Ryan,

    The direct reply is very appreciated. I understand the issue of deadlines and such. (I had one recently with school.) I'm guessing the staff is just getting spread too thin as of late. Thinking back on it, I do suppose AnandTech has expanded coverage. I guess it could be taking its toll on you guys.

    I still like the content, I'll be coming back for the foreseeable future. It is nice to see staff members listening to the audience. You have my gratitude.
  • Brett Howse - Saturday, April 2, 2016

    Just want to say thanks to Ryan for cleaning this up. I was working on little sleep at Build and trying to get this done before the keynote started on day 2 - which I didn't quite make, so it was finished while listening to the keynote. I was a bit distracted and next time I'll make sure to give it a couple of read-throughs before posting.
  • CaedenV - Thursday, March 31, 2016

    The holo lens really seems like the coolest tech in the AR/VR space with the single exception of why on earth it is a tetherless experience. I get that it is neat... but why? And why on a 1st gen device?
    Sure, nobody enjoys tethers. But the amount of processing available in my desktop, or even in my puny little ultrabook, is going to far outstrip what is available in this headset. And while the first few generations of this are going to be very niche products, I don't think it is a valid concern to have tethers tangling with people walking around a room. The advantage of having a full desktop or laptop providing the power for this device would allow for much greater complexity, a much lighter headset, no worries about battery life, and the ability to have a much greater field of view (which they have hinted at being a processing/power/expense limitation).

    I mean, perhaps there is a reason they want it to be an all-in-one unit... but if there is then they have not done a very good job at explaining it. Still, gen 3-4 of this tech a few years down the road with a wider field of view and longer battery life would be absolutely fantastic!
