Browsing by Author "Sonlu, Sinan"
Now showing 1 - 7 of 7
Item (Open Access): An augmented crowd simulation system using automatic determination of navigable areas (Elsevier Ltd, 2021-04). Doğan, Yalım; Sonlu, Sinan; Güdükbay, Uğur.
Crowd simulations imitate the group dynamics of individuals in different environments. Applications in entertainment, security, and education require augmenting simulated crowds into videos of real people. In such cases, virtual agents should realistically interact with the environment and the people in the video. One component of this augmentation task is determining the navigable regions in the video. In this work, we utilize semantic segmentation and pedestrian detection to automatically locate and reconstruct the navigable regions of surveillance-like videos. We place the resulting flat mesh into our 3D crowd simulation environment to integrate virtual agents that navigate inside the video while avoiding collisions with real pedestrians and other virtual agents. We report the performance of our open-source system on real-life surveillance videos, based on the accuracy of the automatically determined navigable regions and camera configuration. We show that our system generates accurate navigable regions for realistic augmented crowd simulations.

Item (Open Access): Conversational agent expressing OCEAN personality and emotions using Laban movement analysis and nonverbal communication cues (2019-08). Sonlu, Sinan.
Conversational human characters are heavily used in computer animation to convey various messages. The appearance, movement, and voice of such characters influence their perceived personality. By analyzing different channels of human communication, including body language, facial expression, and vocalics, it is possible to design animation that exhibits a consistent personality. This enhances the message and improves the realism of the virtual character. Using the OCEAN personality model, we design internal agent parameters that are mapped to movement and sound modifiers, which in turn produce the final animation.
Laban Movement Analysis and nonverbal communication cues are used for the operations that output bone rotations and facial shape-key values at each frame. Correlations between personality and spoken text, and relations between personality and vocal features, are integrated to produce comprehensive agent behavior. Multiple animation modification algorithms and a personality-based dialogue selection method are introduced. The resulting conversational agent is tested in different scenarios, including a passport check and a fast-food order. Using a speech-to-text API, the user controls the dialogue flow. Recorded interactions are evaluated using Amazon Mechanical Turk, where multiple statements about the agent's personality are rated by the crowd. In each experiment, one personality parameter is set to an extreme while the others remain neutral, and we measure the effect on perception.

Item (Open Access): A conversational agent framework with multi-modal personality expression (Association for Computing Machinery, 2021-02). Sonlu, Sinan; Güdükbay, Uğur; Durupınar, Funda.
Consistently exhibited personalities are crucial elements of realistic, engaging, and behavior-rich conversational virtual agents. Both nonverbal and verbal cues help convey these agents' unseen psychological states, contributing to our effective communication with them. We introduce a comprehensive framework to design conversational agents that express personality through nonverbal behaviors like body movement and facial expressions, as well as verbal behaviors like dialogue selection and voice transformation. We use the OCEAN personality model, which defines personality as a combination of five orthogonal factors: openness, conscientiousness, extraversion, agreeableness, and neuroticism. The framework combines existing personality expression methods with novel ones, such as new algorithms to convey Laban Shape and Effort qualities.
We perform Amazon Mechanical Turk studies to analyze how different communication modalities influence our perception of virtual agent personalities and compare their individual and combined effects on each personality dimension. The results indicate that our personality-based modifications are perceived as natural and that each additional modality improves perception accuracy, with the best performance achieved when all modalities are present. We also report correlations in the perception of conscientiousness with neuroticism and of openness with extraversion.

Item (Open Access): Human movement personality detection parameters (IEEE, 2024-06-23). Sonlu, Sinan; Doğan, Yalım; Ergüzen, Arçin Ülkü; Ünalan, Musa Ege; Demirci, Serkan; Durupınar, Funda; Güdükbay, Uğur.
In this study, we develop a system that detects apparent personality traits from animation data containing human movements. Because existing datasets lack sufficient variance for this purpose, we determined labels for the samples in two human-animation datasets, in terms of the Five-Factor Personality Theory, with the help of a user study. Using these labels, we identified movement parameters that depend strongly on personality traits, based on Laban Movement Analysis categories. The artificial neural networks we trained for personality analysis from animation data show that models taking the motion parameters determined in the study as input achieve higher accuracy than models taking raw animation data as input. Using these parameters to evaluate human movements in terms of personality traits should therefore improve such systems.

Item (Open Access): Personality expression in cartoon animal characters using Sasang typology (Wiley, 2023-05-01). Mailee, Hamila; Sonlu, Sinan; Güdükbay, Uğur.
Movement style is an adequate descriptor of different personalities.
While many studies investigate the relationship between apparent personality and high-level motion qualities in humans, similar research on animal characters remains scarce. The variety in animals' skeletal configurations and textures complicates their pose estimation. Our affect analysis framework includes a workflow for pose extraction in animal characters and a parameterization of high-level animal motion descriptors inspired by Laban movement analysis. Using a dataset of quadruped walk cycles, we demonstrate that cartoon animal characters display distinct typologies, reporting the point-biserial correlation between our motion parameters and the Sasang categories that reflect different personalities.

Item (Embargo): Personality perception in human videos altered by motion transfer networks (Elsevier Ltd, 2024-04). Yurtoğlu, Ayda; Sonlu, Sinan; Doğan, Yalım; Güdükbay, Uğur.
The successful portrayal of personality in digital characters improves communication and immersion. Current research focuses on expressing personality by modifying animations using heuristic rules or data-driven models. While studies suggest that motion style strongly influences apparent personality, the role of appearance can be similarly essential. This work analyzes the influence of movement and appearance on the perceived personality of short videos altered by motion transfer networks. We label the personalities in conference video clips with a user study to determine the samples that best represent the Five-Factor model's high, neutral, and low traits. We alter these videos using the Thin-Plate Spline Motion Model, using the selected samples as the source and driving inputs. We follow five different cases to study the influence of motion and appearance on personality perception.
Our comparative study reveals that motion and appearance influence different factors: motion strongly affects perceived extraversion, while appearance helps convey agreeableness and neuroticism.

Item (Open Access): Towards understanding personality expression via body motion (2024-05-29). Sonlu, Sinan; Doğan, Yalım; Ergüzen, Arçin Ülkü; Ünalan, Musa Ege; Demirci, Serkan; Durupınar, Funda; Güdükbay, Uğur.
This work addresses the challenge of data scarcity in personality-labeled datasets by introducing personality labels for clips from two open datasets, ZeroEGGS and Bandai, which provide diverse full-body animations. To this end, we present a user study that annotates short clips from both sets with labels based on the Five-Factor Model (FFM) of personality. We chose features informed by Laban Movement Analysis (LMA) to represent each animation. These features then guided our selection of samples with distinct motion styles for inclusion in the user study, yielding high personality variance while keeping the study duration and cost viable. Using the labeled data, we ran a correlation analysis to find features that correlate strongly with each personality dimension. Our regression analysis indicates that highly correlated features are promising for accurate personality estimation. We share our early findings, code, and data publicly.
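Several of the items above (the Sasang typology paper and the body-motion study) rely on the point-biserial correlation to relate a continuous motion feature to a binary trait category. A minimal sketch of that statistic, using made-up feature values and labels purely for illustration (the feature name and data are not from any of the papers):

```python
import math

def point_biserial(values, labels):
    """Point-biserial correlation between a continuous motion feature
    (e.g., an LMA-style per-clip score) and binary labels
    (1 = high trait / category present, 0 = low / absent).

    Equivalent to the Pearson correlation between the feature and the
    0/1 labels; uses the population standard deviation of the feature.
    """
    n = len(values)
    group1 = [v for v, l in zip(values, labels) if l == 1]
    group0 = [v for v, l in zip(values, labels) if l == 0]
    n1, n0 = len(group1), len(group0)
    mean1 = sum(group1) / n1
    mean0 = sum(group0) / n0
    mean_all = sum(values) / n
    std_n = math.sqrt(sum((v - mean_all) ** 2 for v in values) / n)
    return (mean1 - mean0) / std_n * math.sqrt(n1 * n0 / n ** 2)

# Hypothetical example: a per-clip "movement speed" feature against
# high/low extraversion labels from a user study.
speeds = [0.9, 1.2, 1.1, 0.4, 0.5, 0.6]
extraverted = [1, 1, 1, 0, 0, 0]
r = point_biserial(speeds, extraverted)  # close to +1: faster clips rated extraverted
```

In practice, `scipy.stats.pointbiserialr` computes the same quantity along with a p-value; the hand-rolled version above just makes the formula explicit.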