Browsing by Subject "Gesture."
Now showing 1 - 2 of 2
Item Open Access
A context-aware approach for enhancing gesture recognition accuracy on handheld devices (2010)
Yıldırım, Hacı Mehmet

Input capabilities (e.g., joystick, keypad) of handheld devices allow users to interact with the user interface to access information and mobile services. However, these input capabilities are severely limited by the constraints of mobile form factors, so new input devices and interaction techniques are needed for handheld devices. Gestural interaction with an accelerometer sensor is one of the newest interaction techniques in mobile computing. In this thesis, we introduce solutions for automatically enhancing the gesture recognition accuracy of the accelerometer sensor, together with a standardized gesture library for gestural interaction on the touch screen and the accelerometer sensor. In this novel solution, we propose a framework that decides on suitable signal processing techniques for acceleration sensor data given the user's context. First, the system recognizes the user's context using a pattern recognition algorithm. Then, it automatically chooses signal filtering techniques for the recognized context and recognizes gestures. Gestures are also standardized for better usability. We also present several experiments that show the feasibility and effectiveness of our automated gesture recognition enhancement system.

Item Open Access
Virtual sculpting with advanced gestural interface (2013)
Kılıboz, Nurettin Çağrı

In this study, we propose a virtual reality application that can be used to design preliminary/conceptual models, similar to real-world clay sculpting. The proposed system makes use of an innovative gestural interface that enhances the human-computer interaction experience. The gestural interface employs advanced motion capture hardware, namely data gloves and a six-degrees-of-freedom position tracker, instead of classical input devices such as a keyboard or mouse.
The design process takes place in a virtual environment that contains a volumetric deformable model, design tools, and a virtual hand driven by the data glove and the tracker. The users manipulate the design tools and the deformable model via the virtual hand. Deformation is performed by stuffing material (voxels) into the model or carving it out, either with the help of the tools or directly with the virtual hand. The virtual sculpting system also includes a volumetric force feedback indicator that provides visual aid. We also offer a mouse-like interaction approach in which users can still interact with conventional graphical user interface items, such as buttons, using the data glove and tracker. Users can also control the application with gestural commands, thanks to our real-time, trajectory-based dynamic gesture recognition algorithm. The gesture recognition technique exploits a fast learning mechanism that does not require extensive training data to teach gestures to the system. For recognition, gestures are represented as an ordered sequence of directional movements in 2D. In the learning phase, sample gesture data is filtered and processed to create gesture recognizers, which are essentially finite-state machine sequence recognizers. These recognizers achieve real-time gesture recognition without needing to specify gesture start and end points. The results of the conducted user study show that the proposed method is very promising in terms of gesture detection and recognition performance (73% accuracy) in a stream of motion. Additionally, the user attitude survey indicates that the gestural interface is very useful and satisfactory. One novel aspect of the proposed approach is that it gives users the freedom to create gesture commands according to their preferences for selected tasks. Thus, the presented gesture recognition approach makes the human-computer interaction process more intuitive and user specific.
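The first abstract describes a pipeline that classifies the user's context and then selects a context-appropriate signal filter for the accelerometer data before gesture recognition. The sketch below illustrates that idea only; it is not the thesis implementation. The variance-threshold context classifier, the moving-average filter, and all names (`detect_context`, `preprocess`, the window widths) are assumptions standing in for the thesis's pattern recognition algorithm and filtering techniques.

```python
# Illustrative sketch: context-dependent filter selection for accelerometer
# samples. The real system uses a pattern recognition algorithm for context;
# here a simple variance threshold stands in for it (an assumption).
from statistics import pvariance

def detect_context(window):
    """Crude stand-in classifier: a noisy window suggests the user is walking."""
    return "walking" if pvariance(window) > 1.0 else "sitting"

def moving_average(samples, width):
    """Low-pass smoothing: each output is the mean of the last `width` samples."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - width + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

# Context -> filter choice: heavier smoothing while the user is moving.
FILTER_FOR_CONTEXT = {
    "walking": lambda s: moving_average(s, 5),
    "sitting": lambda s: moving_average(s, 2),
}

def preprocess(window):
    """Filter a raw acceleration window according to the detected context."""
    context = detect_context(window)
    return context, FILTER_FOR_CONTEXT[context](window)
```

The point of the structure is that the filter is chosen per context rather than fixed, which is the abstract's core claim: the same raw gesture signal is cleaned differently while walking than while sitting.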
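The second abstract represents gestures as ordered sequences of 2D directional movements and recognizes them with finite-state machine sequence recognizers over a continuous motion stream, without explicit start/end points. A minimal sketch of that technique, with assumed names and a four-direction quantization (the thesis's actual state machines and learning phase are not reproduced here):

```python
# Illustrative sketch: FSM sequence recognizer over a stream of 2D moves.
# Direction symbols, class names, and the fallback rule are assumptions.

def quantize(dx, dy):
    """Map a 2D displacement to one of four directional symbols."""
    if abs(dx) >= abs(dy):
        return "R" if dx >= 0 else "L"
    return "U" if dy >= 0 else "D"

class GestureFSM:
    """Advances one state per matching direction symbol.

    Fed a continuous symbol stream, it needs no explicit gesture start/end:
    on a mismatch it falls back (restarting if the symbol could begin a new
    attempt) instead of requiring segmented input.
    """
    def __init__(self, name, pattern):
        self.name = name
        self.pattern = pattern  # e.g. ["R", "D", "L"]
        self.state = 0

    def feed(self, symbol):
        if symbol == self.pattern[self.state]:
            self.state += 1
            if self.state == len(self.pattern):
                self.state = 0
                return True   # full gesture matched in the stream
        elif symbol == self.pattern[0]:
            self.state = 1    # symbol could start a fresh attempt
        else:
            self.state = 0
        return False
```

In use, one `GestureFSM` per learned gesture would be fed every quantized displacement in parallel, and whichever recognizer reaches its accepting state reports a detection, which is how a fixed set of state machines can scan an unsegmented motion stream in real time.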