2018
Cross Reality Views Via an Unmanned Aerial Vehicle

Aaron Hitchcock
Master of Science Capstone Project, April 2018

[Report | Presentation]

Paper: A. Hitchcock(+) and K. Sung, "Multi-view augmented reality with a drone," in Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology (VRST 2018), Tokyo, Japan, November 28 – December 1, 2018, pp. 108:1–108:2.
[Poster | Conference Presentation]

The fields of Augmented Reality (AR) and Virtual Reality (VR) have traditionally focused on first-person experiences: by wearing devices such as the Microsoft HoloLens, Oculus Rift, or Google Cardboard, users can enter and interact with a virtual space. The Augmented Space Library (ASL) [1], developed by the Cross Reality Collaboration Sandbox (CRCS) research group [2], combines physical and virtual spaces with virtual objects and allows multiple users, both local and remote, to interact. The capabilities of these technologies can be expanded with a remotely controlled camera, which adds third-person or remote first-person viewing. This project creates a system and a corresponding API for integrating these views into AR and VR applications as well as the ASL system. The system lets a user navigate and explore a remote physical space in real time, or see themselves and their surroundings in the third person. Functionally, this requires the remote control of a highly maneuverable camera, which was achieved with a drone, or Unmanned Aerial Vehicle (UAV). These new virtual views enable new forms of AR/VR interaction, since all prior physical points of view were bound to the user.
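The core idea of switching between a user-bound first-person view and a drone-supplied third-person or remote view can be sketched as a small view controller. This is a minimal illustrative sketch only; the class and method names (`DroneViewController`, `ViewMode`, `steer`) are hypothetical and are not the actual ASL API.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ViewMode(Enum):
    FIRST_PERSON = auto()          # rendered from the headset's own pose
    THIRD_PERSON = auto()          # drone camera observing the local user
    REMOTE_FIRST_PERSON = auto()   # drone camera exploring a remote space


@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float


class DroneViewController:
    """Hypothetical mediator between an AR/VR client and a
    drone-mounted camera (illustrative names, not the ASL API)."""

    def __init__(self) -> None:
        self.mode = ViewMode.FIRST_PERSON
        self.drone_pose = Pose(0.0, 0.0, 0.0, 0.0)

    def set_mode(self, mode: ViewMode) -> None:
        self.mode = mode

    def active_pose(self, user_pose: Pose) -> Pose:
        # First person: render from the user's own tracked pose.
        if self.mode is ViewMode.FIRST_PERSON:
            return user_pose
        # Third person / remote: render from the drone camera's pose.
        return self.drone_pose

    def steer(self, dx: float, dy: float, dz: float, dyaw: float) -> None:
        # Drone-backed views are navigable; the first-person view
        # stays bound to the user and cannot be steered.
        if self.mode is ViewMode.FIRST_PERSON:
            raise RuntimeError("cannot steer the user's own viewpoint")
        p = self.drone_pose
        self.drone_pose = Pose(p.x + dx, p.y + dy, p.z + dz, p.yaw + dyaw)
```

In this sketch, the design choice is that the controller only decides *which* pose drives rendering; actual drone flight commands and video streaming would sit behind the same interface in a full system.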

ACM VRST: Conference Poster + Fun

Under the supervision of Dr. Kelvin Sung, Division of Computing & Software Systems, UW Bothell.