Perceptually driven stereoscopic camera control in 3D virtual environments
Depth perception has long been studied in the fields of psychology, physiology, and even art. Human visual perception recovers the spatial layout of the outside world by using visual depth cues. Binocular disparity, one of these cues, arises from the separation between the two different views observed by the two eyes, and it forms the basis of stereoscopic vision. Emerging technologies replicate binocular disparity in order to create the illusion of depth and provide stereoscopic 3D. However, the complexity of applying the underlying principles of 3D perception has confronted researchers with the problem of incorrectly produced stereoscopic content, and delivering a realistic yet comfortable 3D experience remains a great challenge. In this work, we present a camera control mechanism comprising a novel approach for disparity control and a model for path generation, and we address the challenges of stereoscopic 3D production by presenting a comfortable viewing experience to users. Our disparity system tackles the accommodation/convergence conflict, the best-known cause of visual fatigue in stereo systems, by taking objects' importance into consideration; stereo camera parameters are calculated automatically through an optimization process. In the second part of our control mechanism, a camera path is constructed for a given 3D environment and its scene elements. Since moving around the important regions of objects is a desired scene exploration task, object saliency is used for viewpoint selection around the scene elements. The path is generated from linked Bézier curves, which guarantees that it passes through the pre-determined viewpoints. Although there is a considerable amount of research in the field of stereo content creation, we believe that approaching the problem from the scene content aspect provides a uniquely promising experience.
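The path-generation idea above can be sketched as linked cubic Bézier segments that interpolate a list of pre-selected viewpoints with shared tangents at the joins, so the curve is C1-continuous and is guaranteed to pass through every viewpoint. The sketch below is only a minimal illustration under our own assumptions (Catmull-Rom-style tangents, NumPy, hypothetical function names); the thesis's actual construction and its saliency-based viewpoint selection are not reproduced here.

```python
import numpy as np

def bezier_path(viewpoints, tension=0.5):
    """Build C1-continuous linked cubic Bezier segments that pass
    through every given viewpoint. Tangents at interior viewpoints
    are chosen Catmull-Rom style from neighbouring points
    (an assumption for this sketch, not the thesis's method)."""
    P = np.asarray(viewpoints, dtype=float)
    n = len(P)
    T = np.zeros_like(P)
    T[1:-1] = tension * (P[2:] - P[:-2])     # interior tangents
    T[0] = 2.0 * tension * (P[1] - P[0])     # one-sided end tangents
    T[-1] = 2.0 * tension * (P[-1] - P[-2])
    segments = []
    for i in range(n - 1):
        # Control points: endpoints are the viewpoints themselves,
        # inner controls offset by one third of the tangent vector,
        # so adjacent segments share the same derivative at the join.
        c0 = P[i]
        c1 = P[i] + T[i] / 3.0
        c2 = P[i + 1] - T[i + 1] / 3.0
        c3 = P[i + 1]
        segments.append((c0, c1, c2, c3))
    return segments

def eval_bezier(segment, t):
    """Evaluate one cubic Bezier segment at parameter t in [0, 1]."""
    c0, c1, c2, c3 = segment
    u = 1.0 - t
    return u**3 * c0 + 3 * u**2 * t * c1 + 3 * u * t**2 * c2 + t**3 * c3
```

By construction, the end of each segment coincides with the next viewpoint and the derivative there matches the next segment's starting derivative, which is what "linked" curves need for smooth camera motion.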
We validate our approach with user studies that compare our method against two existing disparity control models. The results show that our method is superior in perceived quality, depth, and comfort.