Jake Stewart
Bachelor of Science Capstone Project, August 2020
[Link to Capstone Presentation]
[Link to Project Source Code]
[Link to Final Project Report (Restricted Access) ]
[Link to Project Folder (Restricted Access) ]
This project is part of a larger effort to bridge the gap between physical and virtual collaboration. The goal is to let remote users interactively manipulate a virtual view of a room captured by a few physical cameras set in place. Our solution samples the images from each physical camera and synthesizes a virtual view that appears to come from a camera at a new position between the physical cameras. The user can move this virtual viewpoint to see the room from different perspectives.
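To make the idea concrete, here is a minimal sketch of one common way to synthesize such an in-between view from a rectified pair of camera images: estimate per-pixel disparity, then shift pixels by a fraction of it. This uses OpenCV purely for illustration and is not the project's actual rendering pipeline; the function name and parameters are assumptions.

import cv2
import numpy as np

def synthesize_view(left, right, t):
    """Render a virtual view at fraction t along the baseline
    (t = 0 is the left camera, t = 1 the right camera)."""
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)

    # Per-pixel disparity from the rectified pair (StereoBM returns
    # fixed-point values scaled by 16).
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(gray_l, gray_r).astype(np.float32) / 16.0

    h, w = gray_l.shape
    virtual = np.zeros_like(left)
    xs = np.arange(w)
    for y in range(h):
        # Forward-warp left-image pixels toward the virtual viewpoint:
        # a point at x in the left view appears at x - disparity in the
        # right view, so the in-between view uses x - t * disparity.
        shifted = (xs - t * disparity[y]).astype(int)
        valid = (disparity[y] > 0) & (shifted >= 0) & (shifted < w)
        virtual[y, shifted[valid]] = left[y, xs[valid]]
    return virtual  # holes (occlusions, unmatched pixels) stay black

Calling synthesize_view(left, right, 0.5) would approximate the view from a camera halfway between the two; a real pipeline would also fill the holes left by the forward warp.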
I began working on the project after Daniel Smith had created the prototype that could create these virtual views. My goal was to take what had already been done, understand the system, improve the output quality, and optimize its performance with the hope of achieving real-time interaction. A significant portion of my time was spent building tools to visualize the states of the system, which helped me analyze the algorithm, diagnose errors, and implement appropriate solutions. Specifically, the major components I worked on were: improving the camera calibration process so that the cameras' positions and rotations relative to each other are more accurate; creating a more precise alternative solution for synthesizing the virtual view; and optimizing the implementation.
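To illustrate the calibration component, below is a minimal sketch of how the relative pose (rotation R and translation T) between two cameras can be estimated with OpenCV's stereo calibration from checkerboard captures. This is one standard approach rather than necessarily the project's exact method; the function name, pattern size, and inputs are hypothetical.

import cv2
import numpy as np

PATTERN = (9, 6)     # inner checkerboard corners (hypothetical board)
SQUARE = 0.025       # checkerboard square edge length in meters

def calibrate_pair(image_pairs, K_l, D_l, K_r, D_r, image_size):
    """Estimate the right camera's rotation R and translation T relative
    to the left camera. image_pairs is a list of (left, right) grayscale
    checkerboard captures; K_*/D_* are intrinsics and distortion
    coefficients from prior single-camera calibration."""
    # One set of 3-D board coordinates, reused for every image pair.
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)
    objp *= SQUARE

    obj_pts, l_pts, r_pts = [], [], []
    for left, right in image_pairs:
        ok_l, corners_l = cv2.findChessboardCorners(left, PATTERN)
        ok_r, corners_r = cv2.findChessboardCorners(right, PATTERN)
        if ok_l and ok_r:  # keep only pairs where both cameras see the board
            obj_pts.append(objp)
            l_pts.append(corners_l)
            r_pts.append(corners_r)

    # Fix the intrinsics and solve only for the relative pose (R, T).
    ret = cv2.stereoCalibrate(
        obj_pts, l_pts, r_pts, K_l, D_l, K_r, D_r, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    err, R, T = ret[0], ret[5], ret[6]
    return R, T, err   # err is the RMS reprojection error

A lower reprojection error here translates directly into more accurate relative positions and rotations, which is what the view synthesis depends on.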
Hand-tuned camera position configurations:
Example: 424×240 resolution, cameras close together (~20 cm), and a small difference in viewing angle (~5-10 degrees towards each other)
Here is a second, likely more practical, example: 424×240 resolution, cameras farther apart (~60 cm), the left camera elevated (~10 cm), and a larger difference in viewing angle (~25 degrees towards each other)
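For illustration, the second configuration above could be expressed as a hand-tuned extrinsics structure like the one below; the field names, units, and sign conventions are assumptions, not the project's actual configuration format.

import numpy as np

def yaw(degrees):
    """Rotation about the vertical (y) axis; with x right, y up, z
    forward, a positive angle turns the camera's forward axis toward +x."""
    a = np.radians(degrees)
    return np.array([[np.cos(a),  0.0, np.sin(a)],
                     [0.0,        1.0, 0.0],
                     [-np.sin(a), 0.0, np.cos(a)]])

RESOLUTION = (424, 240)

CAMERAS = {
    # position: (x, y, z) in meters; each camera is toed in 12.5 degrees
    # so the pair converges by ~25 degrees in total.
    "left":  {"position": (-0.30, 0.10, 0.0), "rotation": yaw(+12.5)},
    "right": {"position": (+0.30, 0.00, 0.0), "rotation": yaw(-12.5)},
}

The two entries mirror the quantities listed in the example: a ~60 cm baseline, a ~10 cm elevation offset for the left camera, and ~25 degrees of total convergence.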