One of the activities I most anticipated in coming out to Build 2015 was experiencing the HoloLens. I saw the announcement in January and watched the related video a number of times leading up to Build. Of course, the hope was that this would be the developer give-away, which would have been incredible and allowed us to build for the device. And when Alex stated during the keynote, “I have hundreds of these devices,” those around me and I grew very excited at what that might mean. Then the realization set in that hundreds would not be enough for everyone, given the thousands of us there. Alex’s plan for these hundreds of devices was to offer sessions where attendees could try the devices in a couple of scenarios. I chose the Hands-on Developer option and within a couple of hours received an email with my acceptance into the Holographic Academy.
The development experience was a full 4.5-hour immersion into the world of holographic development. I had my own workstation and a HoloLens device attached by USB. The session followed a script designed to get you comfortable with both the tools and the device. After a short sample app to get used to the HoloLens, we moved on to recreating the ‘Origami’ app. The development took place about equally in Unity and Visual Studio.
Unity is a development tool that is very popular in game development. It provides a 3D workspace for placing objects, lights, and cameras, and for defining the actions for these items. The assets we needed for our application were provided on our workstations, and it was a matter of placing them in the correct location in the object tree. This part reminded me a little of XAML development in Blend. Actions for the objects were defined by creating C# classes and dragging them onto the objects.
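A class you can drag onto an object in Unity is just a C# class deriving from MonoBehaviour. As a rough illustration (the class and field names here are my own, not from the lab, and this only runs inside Unity):

```csharp
using UnityEngine;

// Hypothetical behavior that could be dragged onto an object in the
// Unity object tree. Once attached, the public SpinSpeed field shows
// up as an editable property in Unity's Inspector panel.
public class Spinner : MonoBehaviour
{
    public float SpinSpeed = 45.0f; // degrees per second

    void Update()
    {
        // Rotate the object this script is attached to, once per frame.
        transform.Rotate(Vector3.up, SpinSpeed * Time.deltaTime);
    }
}
```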
The C# code is modified using Visual Studio. The code either listens for events or periodically checks the state of the HoloLens view, which reminded me of game or XNA development: see if my world has changed and do something in response to that change. The lab provided the code for the classes so we could spend our time seeing the implementation on the device. However, there was flexibility in how we coded certain components, especially the voice recognition. The lab code included standard phrases, but we were encouraged to include our own text and make the app respond to that. This worked quite well without any ‘voice training’ for the device, even with other people around interacting with their own devices.
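To give a feel for the voice piece: Unity exposes a KeywordRecognizer in its Windows speech APIs, and the custom phrases likely hook in along these lines. This is my own sketch, not the lab's actual code, and the phrases and handler names are invented:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech; // Unity's Windows speech support

// Hypothetical sketch of wiring custom voice phrases to actions.
public class SpeechCommands : MonoBehaviour
{
    // My own phrases stand in for the lab's standard ones.
    private readonly string[] keywords = { "drop sphere", "reset world" };
    private KeywordRecognizer recognizer;

    void Start()
    {
        // Listen for the phrases; no per-user voice training required.
        recognizer = new KeywordRecognizer(keywords);
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        // React to whichever phrase was heard.
        Debug.Log("Heard: " + args.text);
    }
}
```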
After the objects are bound and the code modified, Unity builds a Visual Studio solution and places it in a folder via a custom add-in to Unity, which is not publicly available at this time. The solution is opened in Visual Studio and the application deployed to the USB-connected HoloLens device. It is important to hold the HoloLens pointed at the location where it will be used while the application launches, because the code places the object a given number of inches in front of the device. Once deployed, the USB cable can be removed. Now untethered, you can view the app in 3D while walking around it, as well as moving closer and farther away. This movement around the visual is accentuated by the spatial sound in the device, making the experience more realistic without any of the nausea or motion sickness sometimes experienced with other devices. I was also able to see and carry on a conversation with members of the HoloLens team present in the room.
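The "place it in front of where the device is pointing" step can be sketched simply: in a HoloLens app the main camera tracks the device, so positioning relative to the camera positions relative to the wearer. Again a hypothetical illustration, with a distance value of my own choosing:

```csharp
using UnityEngine;

// Hypothetical sketch: at launch, place this object a fixed distance
// in front of wherever the device is currently pointing.
public class PlaceInFront : MonoBehaviour
{
    public float Distance = 2.0f; // illustrative value, in meters

    void Start()
    {
        // The main camera follows the HoloLens, so camera-forward is
        // the direction the wearer is looking when the app launches.
        var cam = Camera.main.transform;
        transform.position = cam.position + cam.forward * Distance;
    }
}
```

This is also why holding the device toward the intended spot during launch matters: the placement is computed once, from the device's pose at that moment.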
The development experience was simplified for this short session, but I think it was enough to get a sense of the techniques needed to make this work. I am guessing that this type of development is only needed for immersive 3D applications. Universal Applications, which are also Holographic Applications, would just be created with the current development model in Visual Studio. I would also think that these would be available for placement from some kind of app store interface instead of the deployment scenario described here. This also raises the question of the storage capabilities of the device, which we did not discuss. I look forward to getting my hands on a device and seeing what it can do without a script.