Browsing by Subject "Blending"
Now showing 1 - 3 of 3
Item (Open Access)
Data-driven synthesis of realistic human motion using motion graphs (2014)
Dirican, Hüseyin
Realistic human motion is an essential part of a diverse range of media, such as feature films, video games and virtual environments. Motion capture provides realistic human motion data using sensor technology. However, motion capture data is not flexible, and this drawback limits its utility in practice. In this thesis, we propose a two-stage approach that makes motion capture data reusable for synthesizing new motions in real time via motion graphs. Starting from a dataset of various motions, we construct a motion graph of similar motion segments and calculate the parameters, such as blending parameters, needed in the second stage. In the second stage, we synthesize a new human motion in real time, depending on the blending technique selected. Three different blending techniques, namely linear blending, cubic blending and anticipation-based blending, are provided to the user. In addition, a motion clip preference approach, applied to the motion search algorithm, enables users to control the motion clip types in the resulting motion.

Item (Open Access)
Editing heightfield using history management and 3D widgets (IEEE, 2009-09)
Yalçın, M. Aydın; Çapin, Tolga K.
In virtual environments, terrain is generally modeled by a heightfield, a 2D structure. To create the desired terrain geometry, software editors dedicated to this task have been developed. Graphics hardware, data structures and rendering techniques are developing rapidly, opening up new possibilities to the user, and terrain editor functionality is following these improvements (for example, real-time lighting updates during editing operations and multi-texture blending). Yet current terrain editors mostly fail to give users feedback about their actions and to help them understand and undo editing operations on the terrain. The aim of this study is to investigate 3D-widget-based visualization of possible editing (sculpting) actions on terrain and to help the user undo previous operations. © 2009 IEEE.

Item (Open Access)
Real-time parameterized locomotion generation (2008)
Akbay, Muzaffer
Reuse and blending of captured motions to create realistic human body motion is considered one of the challenging problems in animation and computer graphics. Locomotion (walking, running and jogging) is one of the most common types of daily human motion. Based on blending of multiple motions, we propose a two-stage approach for generating locomotion according to user-specified parameters, such as linear and angular velocities. Starting from a large dataset of various motions, we construct a motion graph of similar short motion segments. This process includes the selection of motions according to a set of predefined criteria, the correction of errors in foot positioning, pre-adjustments, motion synchronization, and transition partitioning. In the second stage, we generate an animation according to the specified parameters by following a path on the graph at run time, which can be performed in real time. Two different blending techniques are used at this step depending on the number of input motions: blending based on scattered data interpolation and blending based on linear interpolation. Our approach provides an expandable and efficient motion generation system, which can be used in real-time applications.
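All three items rely on blending short motion-capture clips, typically by cross-fading poses over a transition window with linear or cubic (ease-in/ease-out) weights. The sketch below illustrates that general idea only; the clip format, window length, and function names are assumptions made for this example and do not reproduce the implementations described in the theses.

```python
# Illustrative sketch of clip blending with linear vs. cubic weights.
# Pose representation (frames x degrees of freedom) and the transition
# window are assumptions for this example, not the authors' data format.
import numpy as np

def blend_weight(t, mode="linear"):
    """Return the blend weight for normalized transition time t in [0, 1]."""
    if mode == "linear":
        return t
    if mode == "cubic":  # smoothstep: eases in and out of the transition
        return t * t * (3.0 - 2.0 * t)
    raise ValueError(f"unknown blend mode: {mode}")

def blend_clips(clip_a, clip_b, window, mode="linear"):
    """Cross-fade the last `window` frames of clip_a into the first
    `window` frames of clip_b. Clips are arrays of shape (frames, dofs)."""
    out = list(clip_a[:-window])
    for i in range(window):
        w = blend_weight((i + 1) / window, mode)
        pose = (1.0 - w) * clip_a[len(clip_a) - window + i] + w * clip_b[i]
        out.append(pose)
    out.extend(clip_b[window:])
    return np.array(out)

# Example: two toy "clips" with 10 frames and 3 degrees of freedom each.
a = np.linspace(0.0, 1.0, 10)[:, None] * np.ones((10, 3))
b = np.ones((10, 3))
blended = blend_clips(a, b, window=4, mode="cubic")
print(blended.shape)  # (16, 3): 6 frames of a, 4 blended frames, 6 frames of b
```

In practice, joint rotations would be interpolated with quaternion slerp rather than componentwise as above, and the transition window would be chosen where the motion graph marks two segments as similar.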