Browsing by Subject "Computer animation"

Now showing 1 - 7 of 7
    Animation of human motion : an interactive tool
    (1991) Mahmud, Syed Kamran
    The goal of this work is the implementation of an interactive, general-purpose human motion animation tool. The tool uses parametric key-frame animation as the animation technique. Different abstractions of motion specification in key-frame generation are explored, and a new notion of semi goal-directed animation for generating key-frame orientations of the human body is introduced to resolve the tradeoff between animator and machine burden in choosing a level of motion specification.
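The parametric key-frame idea the abstract describes can be illustrated with a minimal sketch (not from the thesis; the function and parameter names are hypothetical): a motion parameter such as a joint angle is stored at key frames and interpolated in between.

```python
def interpolate_keyframes(key_times, key_values, t):
    """Linearly interpolate a motion parameter (e.g. a joint angle) at time t."""
    if t <= key_times[0]:
        return key_values[0]
    if t >= key_times[-1]:
        return key_values[-1]
    for i in range(len(key_times) - 1):
        t0, t1 = key_times[i], key_times[i + 1]
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)  # normalized position between the two keys
            return (1 - w) * key_values[i] + w * key_values[i + 1]

# A joint angle (degrees) keyed at t = 0, 1, 2; sampled mid-way between keys.
print(interpolate_keyframes([0.0, 1.0, 2.0], [0.0, 90.0, 45.0], 1.5))  # 67.5
```

A real tool would interpolate full body configurations (many parameters per key) and typically use smoother curves than linear segments.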
    Conversational agent expressing OCEAN personality and emotions using Laban movement analysis and nonverbal communication cues
    (2019-08) Sonlu, Sinan
    Conversational human characters are heavily used in computer animation to convey various messages. The appearance, movement, and voice of such characters influence their perceived personality. By analyzing different channels of human communication, including body language, facial expression, and vocalics, it is possible to design animations that exhibit a consistent personality. This enhances the message and improves the realism of the virtual character. Using the OCEAN personality model, we design internal agent parameters that are mapped into movement and sound modifiers, which in turn produce the final animation. Laban Movement Analysis and nonverbal communication cues are used for the operations that output bone rotations and facial shape key values at each frame. Correlations between personality and spoken text, and relations between personality and vocal features, are integrated to introduce comprehensive agent behavior. Multiple animation modification algorithms and a personality-based dialogue selection method are introduced. The resulting conversational agent is tested in different scenarios, including a passport check and a fast-food order. Using a speech-to-text API, the user controls the dialog flow. Recorded interactions are evaluated using Amazon Mechanical Turk, where multiple statements about agent personality are rated by the crowd. In each experiment, one personality parameter is set to an extreme while the others remain neutral, with an expected effect on perception.
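As a rough illustration of the kind of mapping the abstract describes — OCEAN trait scores driving movement modifiers — here is a toy sketch. The parameter names and coefficients are invented for illustration and are not taken from the thesis.

```python
def personality_to_modifiers(ocean):
    """ocean: dict of trait scores in [-1, 1] for keys 'O', 'C', 'E', 'A', 'N'."""
    return {
        "speed_scale": 1.0 + 0.3 * ocean["E"],         # extraversion -> faster motion
        "posture_openness": 0.5 + 0.5 * ocean["O"],    # openness -> wider posture
        "gesture_smoothness": 0.5 - 0.4 * ocean["N"],  # neuroticism -> more jitter
    }

neutral = {"O": 0.0, "C": 0.0, "E": 0.0, "A": 0.0, "N": 0.0}
extravert = dict(neutral, E=1.0)  # one trait at an extreme, as in the experiments
print(personality_to_modifiers(extravert)["speed_scale"])  # 1.3
```

In the actual system, such modifiers would feed Laban Movement Analysis operations that adjust bone rotations and facial shape keys per frame.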
    Data-driven synthesis of realistic human motion using motion graphs
    (2014) Dirican, Hüseyin
    Realistic human motion is an essential part of a diverse range of media, such as feature films, video games, and virtual environments. Motion capture provides realistic human motion data using sensor technology. However, motion capture data is not flexible, and this drawback limits its utility in practice. In this thesis, we propose a two-stage approach that makes motion-captured data reusable for synthesizing new motions in real time via motion graphs. Starting from a dataset of various motions, we construct a motion graph of similar motion segments and calculate the parameters, such as blending parameters, needed in the second stage. In the second stage, we synthesize a new human motion in real time, depending on the blending technique selected. Three different blending techniques, namely linear blending, cubic blending, and anticipation-based blending, are provided to the user. In addition, a motion clip preference approach, applied to the motion search algorithm, enables users to control the motion clip types in the resulting motion.
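The simplest of the three techniques, linear blending, can be sketched in a few lines (this is illustrative only, not the thesis implementation; a "pose" here is reduced to a flat list of joint angles): a transition between two clips cross-fades corresponding frames.

```python
def linear_blend(pose_a, pose_b, w):
    """Blend two poses (flat lists of joint angles) with weight w in [0, 1]."""
    return [(1 - w) * a + w * b for a, b in zip(pose_a, pose_b)]

def blend_transition(clip_a, clip_b, n):
    """Cross-fade the last n frames of clip_a into the first n frames of clip_b."""
    frames = []
    for i in range(n):
        w = (i + 1) / n  # weight ramps toward clip_b over the transition window
        frames.append(linear_blend(clip_a[len(clip_a) - n + i], clip_b[i], w))
    return clip_a[:len(clip_a) - n] + frames + clip_b[n:]

walk = [[0.0], [0.0], [0.0], [0.0]]     # toy one-joint clips
run = [[10.0], [10.0], [10.0], [10.0]]
print(blend_transition(walk, run, 2))   # [[0.0], [0.0], [5.0], [10.0], [10.0], [10.0]]
```

Real systems blend quaternion joint rotations (e.g. with slerp) rather than raw angles, and a motion graph supplies the pairs of similar segments to transition between.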
    A decision theoretic approach to motion saliency in computer animations
    (Springer, Berlin, Heidelberg, 2011) Arpa, Sami; Bülbül, Abdullah; Çapın, Tolga
    We describe a model to calculate the saliency of objects due to their motions. In a decision-theoretic fashion, perceptually significant objects inside a scene are detected. The work is based on psychological studies and findings on motion perception. By considering motion cues and attributes, we define six motion states. For each object in a scene, an individual saliency value is calculated considering its current motion state and the inhibition-of-return principle. Furthermore, a global saliency value is computed for each object by considering the objects' relationships with each other and the equivalence of their saliency values. The position of the object with the highest attention value is predicted as a possible gaze point for each frame of the animation. We conducted several eye-tracking experiments to observe in practice the motion-attention principles from the psychology literature, and we performed final user studies to evaluate our model and its effectiveness. © 2011 Springer-Verlag.
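The interplay of per-object saliency and inhibition of return can be sketched as a toy model (hypothetical throughout: the state names, saliency values, and decay constants below are assumptions for illustration, not the paper's actual six states or parameters):

```python
# Motion states map to base saliency values (assumed numbers, for illustration).
STATE_SALIENCY = {"static": 0.1, "start": 0.9, "move": 0.6,
                  "speed_up": 0.8, "slow_down": 0.5, "appear": 1.0}

def predict_gaze(objects, inhibition, decay=0.5):
    """objects: name -> motion state; inhibition: name -> suppression in [0, 1]."""
    scores = {name: STATE_SALIENCY[state] * (1.0 - inhibition.get(name, 0.0))
              for name, state in objects.items()}
    target = max(scores, key=scores.get)
    # Inhibition of return: suppress the attended object so attention can shift.
    for name in objects:
        inhibition[name] = 0.9 if name == target else inhibition.get(name, 0.0) * decay
    return target

inh = {}
scene = {"ball": "move", "door": "appear"}
print(predict_gaze(scene, inh))  # door
print(predict_gaze(scene, inh))  # ball (the door is now inhibited)
```

The appearing object wins the first frame; once attended, its score is suppressed and attention returns to the moving object — the qualitative behavior inhibition of return is meant to produce.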
    Personality transfer in human animation: comparing handcrafted and data-driven approaches
    (2024-09) Ergüzen, Arçin Ülkü
    The ability to perceive and alter personality traits in animation has significant implications for fields such as character animation and interactive media. Research and developments that use systematic tools or machine learning approaches show that personality can be perceived from different modalities such as audio, images, videos, and motions. Traditionally, handcrafted frameworks have been used to modulate motion and alter perceived personality traits. However, deep learning approaches also offer the potential for more nuanced and automated personality augmentation than handcrafted approaches. To address this evolving landscape, we compare the efficacy of handcrafted models with deep-learning models in altering perceived personality traits in animations. We examined various approaches for personality recognition, motion alteration, and motion generation. We developed two methods for modulating motions to alter OCEAN personality traits based on our findings. The first method is a handcrafted tool that modifies bone positions and rotations using Laban Movement Analysis (LMA) parameters. The second method involves a deep-learning model that separates motion content from personality traits. We could change the overall animation by altering the personality traits through this model. These models are evaluated through a three-part user study, revealing distinct strengths and limitations in both approaches.
    Real-time simulation and visualization of deformations on heightfields
    (2010) Yalçın, M. Adil
    The applications of computer graphics raise new expectations, such as realistic rendering, real-time dynamic scenes, and physically correct simulations. The aim of this thesis is to investigate these problems on the heightfield structure, an extended 2D model that can be processed efficiently by data-parallel architectures. This thesis presents methods for the simulation of deformations on a heightfield as caused by triangular objects, the physical simulation of objects interacting with the heightfield, and advanced visualization of deformations. The heightfield is stored in two different resolutions to support fast rendering and precise physical simulations as required. The methods are implemented as part of a large-scale heightfield management system, which applies additional level-of-detail and culling optimizations for the proposed methods and data structures. The solutions provide real-time interaction, and recent graphics hardware (GPU) capabilities are utilized to achieve real-time results. All the methods described in this thesis are demonstrated by a sample application, and performance characteristics and results are presented to support the conclusions.
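At its core, a heightfield is a 2D grid of height values, and a deformation lowers the cells under an object's footprint. A minimal CPU sketch of that idea (illustrative only; the actual system works on GPU data-parallel structures and keeps two resolutions):

```python
def deform(heights, footprint, depth, floor=0.0):
    """Lower heightfield cells under an object's footprint, clamped at floor."""
    for r, c in footprint:
        heights[r][c] = max(heights[r][c] - depth, floor)
    return heights

terrain = [[1.0, 1.0], [1.0, 1.0]]   # a tiny 2x2 heightfield
deform(terrain, [(0, 0), (1, 1)], 0.25)
print(terrain)  # [[0.75, 1.0], [1.0, 0.75]]
```

Computing the footprint of a triangular object against the grid (rasterizing the triangle into covered cells) is where the per-cell parallelism of the GPU pays off.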
    Saliency for animated meshes with material properties
    (ACM, 2010-07) Bülbül, Abdullah; Koca, Çetin; Çapin, Tolga; Güdükbay, Uğur
    We propose a technique to calculate the saliency of animated meshes with material properties. The saliency computation considers multiple features of 3D meshes, including their geometry, material, and motion. Each feature contributes to the final saliency map, which is view independent and can therefore be used for both view-dependent and view-independent applications. To verify our saliency calculations, we performed an experiment in which we used an eye tracker to compare the saliency of the regions that viewers look at with that of the other regions of the models. The results confirm that our saliency computation gives promising results. We also present several applications in which the saliency information is used. © 2010 ACM.
