Example based retargeting human motion to arbitrary mesh models

buir.advisor: Çapın, Tolga
dc.contributor.author: Yaz, İlker O.
dc.date.accessioned: 2016-01-08T18:25:32Z
dc.date.available: 2016-01-08T18:25:32Z
dc.date.issued: 2013
dc.department: Department of Computer Engineering (en_US)
dc.description: Ankara : The Department of Computer Engineering and the Graduate School of Engineering and Science of Bilkent University, 2013. (en_US)
dc.description: Thesis (Master's) -- Bilkent University, 2013. (en_US)
dc.description: Includes bibliographical references (leaves 51-55). (en_US)
dc.description.abstract: Animation of mesh models can be accomplished in many ways, including character animation with skinned skeletons, deformable models, or physics-based simulation. Generating animations with all of these techniques is time-consuming and laborious for novice users; however, adapting the wide range of already available human motion capture data can simplify the process significantly. This thesis presents a method for retargeting human motion to arbitrary 3D mesh models with as little user interaction as possible. Traditional motion retargeting systems try to preserve the original motion as is, while satisfying several motion constraints. In our approach, we use a few pose-to-pose examples provided by the user to extract the desired semantics behind the retargeting process, without limiting the transfer to a literal one. Hence, mesh models that have different structures and/or motion semantics from the humanoid skeleton become possible targets. Also, since widely available mesh models often come without any additional structure (e.g., a skeleton), our method does not require such a structure and instead provides a built-in surface-based deformation system. Since deformation for animation purposes can require more than rigid behaviour, we augment existing rigid deformation approaches to provide volume-preserving and cartoon-like deformation. To demonstrate the results of our approach, we retarget several motion capture sequences to three well-known models, and also investigate how automatic retargeting methods developed for humanoid models perform on our models. (en_US)
dc.description.degree: M.S. (en_US)
dc.description.statementofresponsibility: Yaz, İlker O (en_US)
dc.format.extent: xii, 55 leaves, illustrations, tables (en_US)
dc.identifier.itemid: B135590
dc.identifier.uri: http://hdl.handle.net/11693/15850
dc.language.iso: English (en_US)
dc.publisher: Bilkent University (en_US)
dc.rights: info:eu-repo/semantics/openAccess (en_US)
dc.subject: Mesh deformation (en_US)
dc.subject: Volume preservation (en_US)
dc.subject: Animation (en_US)
dc.subject: Motion retargeting (en_US)
dc.subject.lcc: T385 .Y39 2013 (en_US)
dc.subject.lcsh: Computer animation. (en_US)
dc.subject.lcsh: Computer graphics. (en_US)
dc.subject.lcsh: Body, Human--Digital techniques. (en_US)
dc.subject.lcsh: Human locomotion--Computer simulation. (en_US)
dc.subject.lcsh: Image processing--Digital techniques. (en_US)
dc.subject.lcsh: Three-dimensional display systems. (en_US)
dc.title: Example based retargeting human motion to arbitrary mesh models (en_US)
dc.type: Thesis (en_US)
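
As a rough, illustrative sketch of the example-based idea described in the abstract above (not the thesis's actual formulation; it assumes skeleton poses are flat parameter vectors and target mesh poses are vertex arrays, both hypothetical representations), the following Python snippet blends a few user-provided pose-to-pose examples to produce a target mesh pose for a new motion-capture frame:

import numpy as np

def blend_weights(query_pose, example_poses, eps=1e-8):
    # Inverse-distance weights: how close the new skeleton pose is to each
    # user-provided example pose (all poses flattened to parameter vectors).
    d = np.linalg.norm(example_poses - query_pose, axis=1)
    w = 1.0 / (d + eps)
    return w / w.sum()

def retarget_pose(query_pose, example_poses, example_meshes):
    # example_poses  : (k, p) array of k example skeleton poses
    # example_meshes : (k, n, 3) array of the matching target-mesh vertex poses
    w = blend_weights(query_pose, example_poses)
    # Naive linear blend of example mesh poses, weighted by pose similarity.
    return np.tensordot(w, example_meshes, axes=1)

rng = np.random.default_rng(0)
example_poses = rng.normal(size=(3, 10))       # three user-supplied example poses
example_meshes = rng.normal(size=(3, 100, 3))  # corresponding target mesh poses
frame = rng.normal(size=10)                    # one frame of motion capture data
print(retarget_pose(frame, example_poses, example_meshes).shape)  # -> (100, 3)

A full system, as the abstract notes, would replace this naive linear vertex blend with the surface-based, volume-preserving deformation developed in the thesis.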

Files

Original bundle
Name: 0006554.pdf
Size: 14.09 MB
Format: Adobe Portable Document Format