
Browsing by Subject "User interfaces (Computer systems)"

Now showing 1 - 3 of 3
Item (Open Access)
    Camera-based 3D interaction for handheld devices
    (2010) Pekin, Tacettin Sercan
Handheld devices are an important part of our daily life, and interacting with them is unavoidable when using them. Today's user-interface designs are mostly adapted from desktop computers, which makes handheld devices difficult to use. However, processing power, new sensing technologies, and cameras are already available on mobile devices, making it possible to develop systems that communicate through different modalities. This thesis proposes novel approaches, including finger detection, finger tracking, and object motion analysis, to allow efficient interaction with mobile devices. The result of this thesis is a new interface between users and mobile devices: a new way of interacting with the mobile device that enables direct manipulation of objects and requires no extra hardware. The interaction method maps the motion of an object moving in front of the camera (such as a finger or a predefined marker) to a virtual space to achieve manipulation. For finger detection, a new method is created based on how mobile devices are used and on the structure of the thumb; a fast two-dimensional color-based scene analysis method is applied to solve the problem. For finger tracking, a new method is created based on the movement ergonomics of the thumb when the mobile device is held in the hand. Extracting three-dimensional movement from the two-dimensional RGB data is an important part of this section of the study. New 3D pointer data and a pointer image are created for use with 3D input and 3D interaction in 3D scenes. Direct manipulation is also achieved at low cost.
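The color-based detection step described above can be illustrated with a minimal sketch. The thresholds below are a generic RGB skin-color rule used only for illustration; the function names and the specific values are assumptions, not the thesis's actual parameters:

```python
def is_skin(r, g, b):
    # Generic RGB skin-color rule (illustrative, not the thesis's values):
    # bright, reddish pixels with sufficient spread between channels.
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and (max(r, g, b) - min(r, g, b)) > 15)

def detect_finger(frame):
    """Return the centroid of skin-colored pixels in a frame.

    frame: a 2D grid (list of rows) of (r, g, b) tuples.
    Returns (row, col) of the centroid, or None if no skin pixel is found.
    """
    hits = [(y, x)
            for y, row in enumerate(frame)
            for x, (r, g, b) in enumerate(row)
            if is_skin(r, g, b)]
    if not hits:
        return None
    cy = sum(y for y, _ in hits) / len(hits)
    cx = sum(x for _, x in hits) / len(hits)
    return (cy, cx)

# Tiny 2x2 "frame" with one skin-like pixel at row 0, column 1.
frame = [[(10, 10, 10), (200, 120, 90)],
         [(30, 30, 30), (40, 40, 200)]]
print(detect_finger(frame))  # -> (0.0, 1.0)
```

In a full system the detected centroid would then be tracked across frames and mapped to the virtual 3D pointer; here only the per-frame color classification is sketched.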
Item (Open Access)
    Model-based camera tracking for augmented reality
    (2014) Aman, Aytek
Augmented reality (AR) is the enhancement of real scenes with virtual entities. It is used to enhance user experience and interaction in various ways: educational applications, architectural visualizations, military training scenarios, and pure entertainment applications are often augmented to provide a more immersive and interactive experience for the users. With hand-held devices becoming more powerful and cheaper, such applications are becoming very popular. To provide natural AR experiences, the extrinsic camera parameters (position and rotation) must be calculated accurately, robustly, and efficiently so that virtual entities can be overlaid onto the real environment correctly. Estimating extrinsic camera parameters in real time is a challenging task. In most camera-tracking frameworks, visual tracking serves as the main method for estimating the camera pose. Visual tracking systems often use keypoint and edge features for pose estimation. For richly textured environments, keypoint-based methods work well and are heavily used; edge-based tracking, on the other hand, is preferable when the environment is rich in geometry but has little or no visible texture. Pose estimation in edge-based tracking systems generally depends on control points assigned on the model edges. For accurate tracking, the visibility of these control points must be determined correctly, but visibility determination is a computationally expensive process. We propose a method to reduce the computational cost of edge-based tracking by preprocessing the visibility information of the control points. For that purpose, we use persistent control points, which are generated in world space during a preprocessing step. Additionally, we use a more accurate adaptive projection algorithm for the persistent control points to provide a more uniform control-point distribution in screen space. We test our camera tracker in different environments to show the effectiveness and performance of the proposed algorithm. The preprocessed visibility information enables constant-time calculation of control-point visibility while preserving the accuracy of the tracker. We demonstrate a sample AR application with user interaction to present our AR framework, which is developed for a commercially available and widely used game engine.
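The general precompute-then-look-up idea behind constant-time visibility can be sketched as follows. Everything here is an illustrative assumption: the quantization of camera pose into a small set of sampled viewing directions, the stand-in `facing` test (a real system would run an occlusion test against the model), and all names are invented for the sketch, not taken from the thesis:

```python
def precompute_visibility(control_points, view_dirs, is_visible):
    """Offline step: for each sampled viewing direction, record which
    persistent control points pass the (expensive) visibility test."""
    return {d: frozenset(i for i, p in enumerate(control_points)
                         if is_visible(p, d))
            for d in view_dirs}

def visible_points(table, view_dirs, camera_dir):
    """Runtime step: snap the camera direction to the nearest sampled
    direction, then answer visibility with a single table lookup
    instead of re-running the expensive test per control point."""
    nearest = min(view_dirs,
                  key=lambda d: sum((a - b) ** 2
                                    for a, b in zip(d, camera_dir)))
    return table[nearest]

# Toy model: two control points on opposite faces of a cube, each stored
# as (position, outward normal). A point counts as "visible" when its
# normal faces the camera direction (stand-in for real occlusion tests).
points = [((0, 0, 1), (0, 0, 1)),     # on the +z face
          ((0, 0, -1), (0, 0, -1))]   # on the -z face
dirs = [(0, 0, 1), (0, 0, -1)]
facing = lambda p, d: sum(n * c for n, c in zip(p[1], d)) > 0

table = precompute_visibility(points, dirs, facing)
print(visible_points(table, dirs, (0.1, 0.0, 0.9)))  # -> frozenset({0})
```

The point of the sketch is the cost split: the per-control-point test runs only offline, while the runtime cost is a small nearest-direction search plus one dictionary lookup, independent of how expensive the visibility test itself is.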
Item (Open Access)
Remediating the data: a study on the interactive dimensions in new media
    (2005) Şenova, Funda
This thesis analyses the role of the interface in altering perception and customizing interaction in new media. It examines the correlation of theory and practice while probing the current debate by means of examples and case studies. The general structure of this research is based on the objectives of interactivity at the cultural and social levels. At each level, interactivity is analyzed through the functional, operational, and design aspects of new media. This study focuses on the interactive dimensions in new media and their effects on the user's perception of, and engagement with, a digitally framed work.

Bilkent University Library © 2015-2025 BUIR
