Boston Code Camp: Part 1

Microsoft Technology Center, Burlington, MA
On March 25, I attended Boston Code Camp 27 at the Microsoft Technology Center in Burlington, MA. Although I’ve never done any .NET programming, the schedule listed some open source and professional development tracks I thought looked interesting. Below is a list of the talks I attended along with my impressions of the ideas and presenters. Where possible I’ll link to any of the presentations.

Living Breathing Digital: Bending Reality with HoloLens

I’d seen the hype about HoloLens online and really couldn’t pass up seeing this thing in real life. Jason Ioffe from BlueMetal did a great job handling a demonstration that wasn’t easy to pull off. Jason was wearing the HoloLens, which was tethered to his MacBook, which was in turn connected to the projector, allowing us to “see through his eyes.”

The HoloLens is totally self-contained and equipped with sensors and cameras that constantly scan the surroundings up to a distance of 20 meters. It’s not a bulky or awkward-looking headset like the VR visors we had in the ’90s. A HUD (which we couldn’t see) was superimposed on the glass looking out at the real world. You use your finger as a mouse: fully extended to hover, then bent to click. I have to say, watching him scroll and click through phantasmal menus made me think of little Danny in “The Shining.”

Jason took a “snapshot” of the room and loaded it into the Unity 3D SDK, where all the walls, desks, and audience members appeared as a gray, 3D video-game scene. Into this scene he imported “artifacts” from the Unity library: a sphere, a revolving doughnut, an apple tree, and a cartoon robot that acted as an interface to a speech-to-text virtual assistant.

Loading these artifacts back into the HoloLens was where the presentation bogged down. It’s hard to keep an audience enraptured while everyone sits there watching the compile, deploy, debug ritual play out in real time. Once he was all set, we watched him shake the apple tree. Digital apples fell from the tree, landed on a front-row desk, and then rolled onto the floor. Looking at the blue cartoon robot, which sat under the tree, he asked it the weather. The robot replied with a text bubble. While the artifacts could “interact” with the scene, they cannot currently deal with changes in the environment. For example, if a digital apple had landed on the desk and I then moved the desk, the apple wouldn’t follow the desk or drop to the floor; it would remain suspended in mid-air.

The HoloLens is a cool piece of tech, and its sleek design stands as a testament to Moore’s Law, but outside of video games or niche specialized uses (medical training, miniaturization design), I have a hard time thinking of practical applications it would be put to.