Michael Tanaya
Master of Science Thesis Project, June 2017
Recent advances in technology have enabled a new generation of Augmented Reality (AR) development in which near-seamless integration and interaction between computer-generated virtual objects and real-world physical objects can be achieved in real time. This maturation of AR technology has resulted in consumer-level commercial AR products. These products support user mobility in the physical world in two ways: the head-mounted solution, exemplified by the Microsoft HoloLens, and the handheld mobile device solution, exemplified by the Google Tango. The head-mounted solution has two main advantages: an immersive experience through its stereoscopic display, and hands-free interaction with the AR world.
The HoloLens supports object manipulation through a Natural User Interface (NUI), which carries the inherent NUI shortcomings of limited precision and intuitiveness. This project investigates approaches to remedy these shortcomings by integrating other user interface paradigms. The user's free hands and the limited display real estate of the head-mounted display suggested exploring solutions based on Tangible User Interface (TUI) devices that can also display Graphical User Interface (GUI) widgets. This thesis explores the potential of linking a popular mobile phone to the HoloLens as a TUI device, designs a GUI for the phone display, adopts results from related user interaction research, and implements a novel solution to overcome the HoloLens UI shortcomings. Results from user testing show that the novel approach supports faster object manipulation in general, and object size adjustment in particular, along with a more efficient and effective object manipulation experience.