Personality transfer in human animation: comparing handcrafted and data-driven approaches
Abstract
The ability to perceive and alter personality traits in animation has significant implications for fields such as character animation and interactive media. Research using systematic tools and machine-learning approaches shows that personality can be perceived from different modalities, such as audio, images, video, and motion. Traditionally, handcrafted frameworks have been used to modulate motion and alter perceived personality traits; deep-learning approaches, however, offer the potential for more nuanced and automated personality augmentation. To address this evolving landscape, we compare the efficacy of handcrafted and deep-learning models in altering perceived personality traits in animations. We examined various approaches for personality recognition, motion alteration, and motion generation, and based on our findings we developed two methods for modulating motion to alter OCEAN personality traits. The first is a handcrafted tool that modifies bone positions and rotations using Laban Movement Analysis (LMA) parameters. The second is a deep-learning model that separates motion content from personality traits, so that altering the personality component changes the overall animation. Both approaches are evaluated through a three-part user study, which reveals distinct strengths and limitations in each.
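The abstract does not include implementation details, but the kind of modulation an LMA-parameter tool performs can be illustrated with a minimal sketch. The snippet below (hypothetical; the function name, parameter names, and the specific amplitude/tempo mapping are assumptions, not the thesis's actual method) exaggerates or dampens joint rotations about the mean pose and resamples the clip in time, two simple proxies for LMA Effort- and Shape-style adjustments:

```python
import numpy as np

def modulate_motion(rotations, amplitude_scale=1.3, speed_scale=1.0):
    """Crude LMA-style modulation sketch (assumed interface, not the thesis tool).

    rotations: array of shape (frames, joints, 3) Euler angles.
    amplitude_scale > 1 exaggerates motion about the mean pose; < 1 dampens it.
    speed_scale > 1 shortens the clip (faster); < 1 lengthens it (slower).
    """
    rotations = np.asarray(rotations, dtype=float)
    # Scale each joint's deviation from its per-joint mean pose.
    mean_pose = rotations.mean(axis=0, keepdims=True)
    scaled = mean_pose + amplitude_scale * (rotations - mean_pose)
    # Linearly resample along the time axis to change tempo.
    n = rotations.shape[0]
    new_n = max(2, int(round(n / speed_scale)))
    idx = np.linspace(0.0, n - 1, new_n)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    w = (idx - lo)[:, None, None]
    return (1.0 - w) * scaled[lo] + w * scaled[hi]
```

In a real system each OCEAN trait would be mapped to a combination of such parameters (e.g. higher Extraversion to larger amplitude and faster tempo); the mapping itself is what the handcrafted framework encodes and what the user study evaluates.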