Developing for HoloLens using Unity

While attending Build I got the opportunity to try out developing a few simple things for the HoloLens. In this post I will share my takeaways from the experience.

A quick demo to start things off

The first thing we got to do was test a finished HoloLens app. Using a holographic radio-controlled car, it showed off how a hologram could interact with the real world and use pathfinding through it.

You looked at a spot in the room and did an “air tap”; a flag appeared at that spot, and your little car would try to reach it by navigating the room. While the car was driving around, you also noticed how the HoloLens audio followed the position of the car. Closing your eyes and finding the car by hearing alone would not be a problem.
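In Unity terms, that loop is essentially a gaze ray, a tap gesture, and a NavMeshAgent. Here is a minimal sketch of how it might look, assuming the gesture API from the HoloLens preview of Unity and a NavMesh baked over the scanned room; flagPrefab and carAgent are placeholders, not names from the actual demo:

```csharp
// A minimal sketch of the gaze + air-tap + pathfinding loop, assuming the
// gesture API from the HoloLens preview of Unity and a NavMesh baked over
// the scanned room. flagPrefab and carAgent are placeholders.
using UnityEngine;
using UnityEngine.AI;            // NavMeshAgent (lives in UnityEngine in older versions)
using UnityEngine.VR.WSA.Input;

public class TapToNavigate : MonoBehaviour
{
    public GameObject flagPrefab;  // flag shown at the gazed spot
    public NavMeshAgent carAgent;  // the car, driven by Unity pathfinding

    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            // Gaze is just a ray from the user's head; see what it hits.
            RaycastHit hit;
            if (Physics.Raycast(headRay, out hit))
            {
                Instantiate(flagPrefab, hit.point, Quaternion.identity);
                carAgent.SetDestination(hit.point);  // let the car find its way
            }
        };
        recognizer.StartCapturingGestures();
    }
}
```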

During this demo we all noted some of the possibilities, but also the limitations, of the HoloLens. More on both at the end of this post.

Developers Unite!

Armed with some idea of what we would be capable of achieving, we launched Unity. If you have ever used Unity before, you are more or less a full-fledged HoloLens developer already. Just skip the game world and place your stuff in the physical world around the player instead.
If you haven’t checked out Unity, think 3D models with code snippets attached to them.

Honestly, my first thought was along the lines of “this is almost too easy”. Obviously that was partly due to being led through the steps, but the lens itself felt natural to integrate with. Most things “just work”: the HoloLens is bound to a camera in Unity, and off you go.
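To give an idea of how little setup there is: the device simply takes over the main camera, and Microsoft's guidance at the time boiled down to placing it at the origin and clearing to black, since black renders as transparent on the lens. A sketch of that setup, not taken from the session code:

```csharp
// A minimal sketch of the "HoloLens is bound to a camera" setup. The values
// follow Microsoft's published guidance at the time; treat this as a sketch.
using UnityEngine;

public class HoloCameraSetup : MonoBehaviour
{
    void Awake()
    {
        Camera cam = Camera.main;
        cam.transform.position = Vector3.zero;         // user starts at the origin
        cam.clearFlags = CameraClearFlags.SolidColor;  // no skybox in the real world
        cam.backgroundColor = Color.black;             // black = transparent on HoloLens
        cam.nearClipPlane = 0.85f;                     // keep holograms a comfortable distance away
    }
}
```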

Take a break? Pfft, yeah right!

Halfway through the session it was time for a well-earned break. Being the nerd I am, I spent that time digging through the code and testing a few things. Not knowing how long it might take until I got my hands on one of these toys again, was there really any other option?

First person shooter

While digging through the API I noticed a few references to the user's hand: both its position and the direction to it were things you could query. So I figured I could place a BFG on my hand and start blasting away. The result? Not as good as I had hoped.
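For the curious, here is roughly what that experiment looked like, reconstructed from memory around the interaction-source events in the Unity HoloLens preview; the gun field is just a placeholder model:

```csharp
// A hedged sketch of tracking the user's hand via the interaction-source
// events from the Unity HoloLens preview. "gun" is any placeholder model.
using UnityEngine;
using UnityEngine.VR.WSA.Input;

public class GunOnHand : MonoBehaviour
{
    public GameObject gun;  // placeholder model to anchor to the hand

    void Awake()
    {
        InteractionManager.SourceUpdated += state =>
        {
            Vector3 handPos;
            if (state.source.kind == InteractionSourceKind.Hand &&
                state.properties.location.TryGetPosition(out handPos))
            {
                // Follow the tracked hand; note the render-volume caveat below.
                gun.transform.position = handPos;
            }
        };
    }
}
```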

One of the major limitations of the HoloLens is where in your view holograms can show up. Based on my own experience, I would say the rendering volume starts roughly 1 meter in front of you and extends about 5 meters in depth, with the box getting wider the further out you look.
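If you want to keep a hologram inside that box, one workaround is to clamp its distance from the head. The 1 and 5 meter values below are my own rough estimates from the session, not documented limits:

```csharp
// A small sketch that clamps a hologram's distance from the user's head to
// the rough 1-5 m band I observed, so it stays inside the visible volume.
using UnityEngine;

public class ClampToViewVolume : MonoBehaviour
{
    const float MinDistance = 1f;  // my estimate, not a documented limit
    const float MaxDistance = 5f;  // my estimate, not a documented limit

    void LateUpdate()
    {
        Transform head = Camera.main.transform;
        Vector3 offset = transform.position - head.position;
        if (offset.sqrMagnitude < 1e-6f) return;  // avoid normalizing a zero vector

        float distance = Mathf.Clamp(offset.magnitude, MinDistance, MaxDistance);
        transform.position = head.position + offset.normalized * distance;
    }
}
```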

As you might imagine, you need to hold your hand out quite a bit to have an object appear on top of it. I heard something about an app that lets you play with a virtual pet; I am not sure how it deals with this limitation, so I may well have missed something.

Augment that!

During the session we got to turn on a setting that drew a wireframe over the real world, showing where the lens had identified surfaces that could be interacted with.

We could then place items on the surfaces that the HoloLens had found. This caused some issues when the things we placed items on were themselves moving, such as other people. With the wireframe on, we could also make sense of some of the glitches we had noted earlier, like the radio car getting stuck in an invisible wall. There seems to be quite a bit of caching going on: when a person moves out of the way, the lens remembers the obstacle for some time.
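Placing an item on a scanned surface boils down to a raycast against the spatial-mapping mesh. A sketch, assuming the mesh has colliders on a layer named "SpatialMapping" (the layer name is my own placeholder, and the wireframe itself is a separate rendering toggle):

```csharp
// A sketch of placing an item on a surface the HoloLens has scanned,
// assuming the spatial-mapping mesh has colliders on a "SpatialMapping"
// layer (placeholder name).
using UnityEngine;

public class PlaceOnSurface : MonoBehaviour
{
    public GameObject itemPrefab;  // placeholder item to place

    void Update()
    {
        if (Input.GetButtonDown("Fire1"))  // stand-in for an air tap
        {
            Transform head = Camera.main.transform;
            int mask = LayerMask.GetMask("SpatialMapping");
            RaycastHit hit;
            if (Physics.Raycast(head.position, head.forward, out hit, 10f, mask))
            {
                // Align the item with the surface normal of the scanned mesh.
                Instantiate(itemPrefab, hit.point, Quaternion.LookRotation(hit.normal));
            }
        }
    }
}
```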

Go gadget go!

During my semi-successful test with rendering things on my hand, I asked about interacting with other devices, like the MS Band. The reply was that the HoloLens has Bluetooth, so using gadgets in combination with it should be possible. I did not find much in the limited code I had access to, but it ought to work like any other Bluetooth interaction using the Windows 10 APIs.

Personally, I consider this a pretty neat thing. Since you are not limited to the sensors built into the HoloLens, your imagination is the limit. Perhaps you could build a scary house that gets scarier when your pulse is racing?
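As a thought experiment, here is what the pulse-driven idea might look like using the Microsoft Band SDK (the Microsoft.Band package). I never got to verify any of this on the device, so treat it purely as a sketch:

```csharp
// A hedged sketch of the "scary house" idea: read the pulse from an MS Band
// over Bluetooth and expose it as a value the app can react to. Assumes the
// Microsoft Band SDK is usable from the app; unverified on the HoloLens.
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Band;
using Microsoft.Band.Sensors;

public class PulseMonitor
{
    public int CurrentHeartRate { get; private set; }

    public async Task StartAsync()
    {
        // Find a paired Band and connect over Bluetooth.
        IBandInfo band = (await BandClientManager.Instance.GetBandsAsync()).First();
        IBandClient client = await BandClientManager.Instance.ConnectAsync(band);

        // Heart rate requires explicit user consent.
        if (client.SensorManager.HeartRate.GetCurrentUserConsent() != UserConsent.Granted)
        {
            await client.SensorManager.HeartRate.RequestUserConsentAsync();
        }

        client.SensorManager.HeartRate.ReadingChanged +=
            (sender, args) => CurrentHeartRate = args.SensorReading.HeartRate;
        await client.SensorManager.HeartRate.StartReadingsAsync();
    }
}
```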

Reflection time

To me, the HoloLens feels like a great way of exploring new technical solutions. Interacting with the world the user sees opens up quite a few intriguing opportunities. However, there are some limitations that need to be considered.

Using the HoloLens for things that rely on precise interaction with the real world would probably be a bad idea for now. With invisible walls showing up every now and then, it might be tricky to make the user understand why something is stuck in mid-air.

Beyond scan precision, the limited view that can be used for rendering restricts immersion, so creating a live role-playing application would probably be kind of “meh”. Another limitation I did not manage to get an answer on is battery life. This could be a deal breaker for some applications, but we will just have to wait and see.

In the end, for a first-edition, pre-release piece of hardware, I was pleasantly surprised. Sure, there are some limitations and glitches, but overall the experience is solid and enables us to create experiences for our users that were not possible before. Hopefully we will all get a chance to create cool things for the HoloLens in the near future.
