Browsing by Subject "Volumetric deformation"
Item (Open Access)
Virtual sculpting with advanced gestural interface (2013)
Kılıboz, Nurettin Çağrı

In this study, we propose a virtual reality application for designing preliminary/conceptual models in a manner similar to real-world clay sculpting. The proposed system makes use of an innovative gestural interface that enhances the human-computer interaction experience. The gestural interface employs advanced motion-capture hardware, namely data gloves and a six-degrees-of-freedom position tracker, instead of classical input devices such as the keyboard and mouse. The design process takes place in a virtual environment that contains a volumetric deformable model, design tools, and a virtual hand driven by the data glove and the tracker. Users manipulate the design tools and the deformable model via the virtual hand. The model is deformed by stuffing material (voxels) into it or carving material out of it, either with the tools or directly with the virtual hand. The virtual sculpting system also includes a volumetric force-feedback indicator that provides visual aid. We also offer a mouse-like interaction approach in which users can still operate conventional graphical user interface items, such as buttons, with the data glove and tracker. Users can also control the application with gestural commands thanks to our real-time, trajectory-based dynamic gesture recognition algorithm. The gesture recognition technique exploits a fast learning mechanism that does not require extensive training data to teach gestures to the system. For recognition, gestures are represented as ordered sequences of directional movements in 2D. In the learning phase, sample gesture data is filtered and processed to create gesture recognizers, which are essentially finite-state-machine sequence recognizers. These recognizers achieve real-time gesture recognition without requiring gesture start and end points to be specified. The results of the conducted user study show that the proposed method is promising in terms of gesture detection and recognition performance (73% accuracy) in a continuous stream of motion. Additionally, the user attitude survey indicates that the gestural interface is useful and satisfactory. One novel aspect of the proposed approach is that it gives users the freedom to create gesture commands for selected tasks according to their own preferences. Thus, the presented gesture recognition approach makes the human-computer interaction process more intuitive and user-specific.
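
The abstract describes deformation as stuffing or carving voxels in or out of the model. The sketch below illustrates the general idea only, under the assumption that the model is a dense boolean voxel grid and the tool footprint is a sphere; the grid size and the carve_sphere/stuff_sphere names are illustrative and not taken from the thesis.

```python
import numpy as np

GRID = 64  # assumed voxels per axis for the deformable model

def sphere_mask(center, radius):
    """Boolean mask of all voxels inside a sphere given in voxel coordinates."""
    idx = np.indices((GRID, GRID, GRID))
    dist2 = sum((idx[a] - center[a]) ** 2 for a in range(3))
    return dist2 <= radius ** 2

def carve_sphere(voxels, center, radius):
    """Remove material where the spherical tool overlaps the model."""
    voxels[sphere_mask(center, radius)] = False

def stuff_sphere(voxels, center, radius):
    """Add material where the spherical tool overlaps the model."""
    voxels[sphere_mask(center, radius)] = True

# Hypothetical usage: start from a solid block and carve a dent near one face.
model = np.ones((GRID, GRID, GRID), dtype=bool)
carve_sphere(model, center=(32, 32, 60), radius=10)
print(model.sum(), "voxels of material remain")
```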
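
The abstract also outlines gesture recognition over a continuous motion stream: trajectories are quantized into ordered 2D direction symbols and matched by finite-state-machine sequence recognizers without explicit start/end points. The following is a minimal sketch of that general scheme, not the thesis code; the eight-direction quantization and the DirectionalRecognizer/recognize_stream names are assumptions made for illustration.

```python
import math

DIRECTIONS = 8  # quantize 2D motion into 8 compass-like direction symbols

def quantize_direction(dx, dy):
    """Map a 2D displacement to one of 8 discrete direction symbols."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(round(angle / (2 * math.pi / DIRECTIONS))) % DIRECTIONS

class DirectionalRecognizer:
    """Chain FSM that accepts one learned sequence of direction symbols."""

    def __init__(self, name, template):
        self.name = name          # gesture label, e.g. "undo"
        self.template = template  # learned symbol sequence, e.g. [0, 2, 4]
        self.state = 0            # index of the next expected symbol

    def feed(self, symbol):
        """Consume one symbol from the stream; return True on a full match."""
        if symbol == self.template[self.state]:
            self.state += 1
            if self.state == len(self.template):
                self.state = 0    # reset so the recognizer keeps scanning
                return True
        elif symbol == self.template[0]:
            self.state = 1        # the gesture may be restarting here
        else:
            self.state = 0        # mismatch: fall back to the initial state
        return False

def recognize_stream(recognizers, points):
    """Run all recognizers over a continuous stream of (x, y) samples."""
    hits = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        symbol = quantize_direction(x1 - x0, y1 - y0)
        for rec in recognizers:
            if rec.feed(symbol):
                hits.append(rec.name)
    return hits

# Hypothetical usage: an "L"-shaped gesture, rightward then downward.
l_gesture = DirectionalRecognizer("L-shape", [0, 0, 6, 6])
stream = [(0, 0), (1, 0), (2, 0), (2, -1), (2, -2), (2, -3)]
print(recognize_stream([l_gesture], stream))  # -> ['L-shape']
```

Because every recognizer simply consumes the same symbol stream and resets itself on a mismatch or a completed match, no segmentation of the stream into individual gestures is needed, which is consistent with the abstract's claim of recognition without specified start and end points.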