Matthew Munson
May 2021
The Augmented Space Library 2.0 (ASL) has been updated to support live video streaming to and from AR, desktop, and VR devices. This is especially useful when implementing networked AR applications, as the real-world camera perspective can be sent to other users, giving them context for where virtual objects are located in the world.
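To give a sense of what sending a camera feed to connected peers involves, here is a minimal, hypothetical sketch (none of these names or functions come from ASL): each encoded frame is length-prefixed before being written to a TCP socket, so the receiver can split the continuous byte stream back into individual frames.

```python
# Illustrative pattern only, not ASL's API: length-prefixed frame
# streaming over a TCP socket.
import socket
import struct


def send_frame(sock: socket.socket, frame_bytes: bytes) -> None:
    """Prefix the encoded frame with its length so the receiver can split the stream."""
    sock.sendall(struct.pack("!I", len(frame_bytes)) + frame_bytes)


def recv_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed frame from the stream."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Keep reading until exactly n bytes have arrived."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf
```

In practice the frames would be compressed (e.g. JPEG or a hardware video codec) before being sent; the framing logic stays the same.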
This video demo showcases some of the capabilities of video streaming from multiple AR devices to a connected PC. The video is a screen capture of the PC application, which displays the video streams from two AR devices in the bottom corners of the screen. An additional camera angle, providing a view of the real-world scene without augmented reality, is overlaid in the top-left corner. Each camera is represented by its viewing frustum in the virtual space, which lets connected users visualize the positions and view orientations of the other players.
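The frustum visualization comes down to standard camera geometry. The sketch below is illustrative only (nothing in it is taken from ASL): it computes the eight world-space corners of a perspective frustum from a camera's pose, vertical field of view, aspect ratio, and near/far planes, which is enough to draw a wireframe gizmo like the ones in the demo.

```python
# Illustrative sketch: world-space corners of a camera's viewing frustum.
import numpy as np


def frustum_corners(position, rotation, fov_y_deg, aspect, near, far):
    """Return the 8 world-space corners of a perspective frustum.

    position: (3,) camera position in world space.
    rotation: (3, 3) camera-to-world rotation matrix.
    The camera is assumed to look along its local +Z axis.
    """
    position = np.asarray(position, dtype=float)
    rotation = np.asarray(rotation, dtype=float)
    corners = []
    for dist in (near, far):
        half_h = dist * np.tan(np.radians(fov_y_deg) / 2.0)
        half_w = half_h * aspect
        for sx in (-1, 1):
            for sy in (-1, 1):
                local = np.array([sx * half_w, sy * half_h, dist])
                corners.append(position + rotation @ local)
    return np.stack(corners)  # shape (8, 3)
```

Connecting the four near-plane corners to the four far-plane corners (plus the camera position to the near corners) produces the familiar frustum wireframe.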
In the demo application, the PC player is able to place cubes along a grid to create a 3D shape. Players on the connected AR devices can walk around and view this object as it’s being made, and their video streams give the PC player context on where the virtual objects sit in the real world. AR players can change the color of a cube by tapping on it.
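A tiny sketch of the grid-placement rule described above (the helper name and cell size are assumptions, not ASL code): snapping each requested placement point to the nearest cell of a uniform grid is what keeps the cubes aligned as the shape grows.

```python
# Hypothetical helper, not part of ASL: snap a point to a uniform grid.
def snap_to_grid(point, cell_size=0.1):
    """Round each coordinate to the nearest multiple of cell_size."""
    return tuple(round(c / cell_size) * cell_size for c in point)


# Example: a placement point of (0.23, 0.0, 0.41) with 0.1 m cells
# snaps to approximately (0.2, 0.0, 0.4).
print(snap_to_grid((0.23, 0.0, 0.41)))
```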