
Browsing by Subject "Moving objects"

Now showing 1 - 5 of 5
  • Open Access
    HMM based method for dynamic texture detection
    (IEEE, 2007) Töreyin, Behçet Uğur; Çetin, A. Enis
    A method for the detection of dynamic textures in video is proposed. It is observed that the motion vectors of most dynamic textures (e.g. sea waves, swaying tree leaves and branches in the wind) exhibit random motion. On the other hand, the regular motion of ordinary video objects has well-defined directions. In this paper, motion vectors of moving objects are estimated and tracked using a minimum-distance metric. The directions of the motion vectors are then quantized to define two three-state Markov models, one corresponding to dynamic textures and the other to ordinary moving objects with consistent directions. Hidden Markov Models (HMMs) are used to classify the moving objects in the final step of the algorithm. (A minimal sketch of this classification step follows the listing below.)
  • Open Access
    Infrared digital holography applications for virtual museums and diagnostics of cultural heritage
    (SPIE, 2011) Paturzo, M.; Pelagotti, A.; Geltrude, A.; Locatelli, M.; Poggi P.; Meucci, R.; Ferraro P.; Stoykova, E.; Yaraş F.; Yöntem, A. Özgür; Kang H.; Onural, Levent
    Infrared digital holograms of different statuettes are acquired. For each object, a sequence of holograms is recorded while rotating the statuette with an angular step of a few degrees. The holograms of the moving objects are used to compose dynamic 3D scenes that are then optically reconstructed by means of spatial light modulators (SLMs) using an illumination wavelength of 532 nm. This kind of reconstruction makes it possible to obtain 3D imaging of the statuettes that could be exploited for virtual museums. © 2011 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).
  • Open Access
    Moving region detection in compressed video
    (Springer, 2004) Töreyin, B. U.; Çetin, A. Enis; Aksay, A.; Akhan, M. B.
    In this paper, an algorithm for moving region detection in compressed video is developed. It is assumed that the video is compressed using either the Discrete Cosine Transform (DCT) or the Wavelet Transform (WT). The method estimates the WT of the background scene from the WTs of the past image frames of the video. The WT of the current image is compared with the WT of the background, and the moving objects are determined from the difference. The algorithm does not perform an inverse WT to obtain the actual pixels of either the current image or the estimated background. In the case of DCT-compressed video, the DC values of the 8 by 8 image blocks of the Y, U and V channels are used for estimating the background scene. This leads to a computationally efficient method and system compared to existing motion detection methods. © Springer-Verlag 2004. (A minimal sketch of the DC-based variant follows the listing below.)
  • Open Access
    Moving region detection in wavelet compressed video
    (IEEE, 2004) Töreyin, B. Uğur; Çetin, A. Enis; Aksay, Anıl; Akhan, M. B.
    In many vision-based surveillance systems, the video is stored in wavelet-compressed form. In this study, an algorithm for moving object and region detection in video compressed using a wavelet transform (WT) is developed. The algorithm estimates the WT of the background scene from the WTs of the past image frames of the video. The WT of the current image is compared with the WT of the background, and the moving objects are determined from the difference. The algorithm does not perform an inverse WT to obtain the actual pixels of either the current image or the estimated background. This leads to a computationally efficient method and system compared to existing motion estimation methods. (A minimal sketch of the wavelet-domain variant follows the listing below.)
  • Open Access
    Oscillatory synchronization model of attention to moving objects
    (Elsevier, 2012) Yilmaz, O.
    The world is a dynamic environment; hence it is important for the visual system to be able to deploy attention on moving objects and attentively track them. Psychophysical experiments indicate that the processes of both attentional enhancement and inhibition are spatially focused on the moving objects; however, the mechanisms of these processes are unknown. The studies indicate that the attentional selection of target objects is sustained via a feedforward-feedback loop in the visual cortical hierarchy, and that only the target objects are represented in attention-related areas. We suggest that feedback from the attention-related areas to early visual areas modulates the activity of neurons: it establishes synchronization with respect to a common oscillatory signal for target items via excitatory feedback, and de-synchronization for distractor items via inhibitory feedback. A two-layer computational neural network model with integrate-and-fire neurons is proposed and simulated for simple attentive tracking tasks. Consistent with previous modeling studies, we show that, via temporal tagging of neural activity, distractors can be attentively suppressed from propagating to higher levels. However, simulations also suggest attentional enhancement of activity for distractors in the first layer, which represents the neural substrate dedicated to low-level feature processing. Inspired by this enhancement mechanism, we developed a feature-based object tracking algorithm with surround processing. Surround processing improved tracking performance by 57% on the PETS 2001 dataset by eliminating target features that are likely to suffer from faulty correspondence assignments. © 2012 Elsevier Ltd. (A minimal sketch of the temporal-tagging idea follows this listing.)
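The first entry above ("HMM based method for dynamic texture detection") hinges on quantizing motion-vector directions into three states and checking whether the resulting state sequence behaves like random, texture-like motion or like direction-consistent object motion. The Python sketch below illustrates that comparison under stated assumptions: the three-state quantizer, the transition matrices, and the plain Markov-chain scoring are illustrative stand-ins, not the trained HMMs from the paper.

# Minimal sketch of the direction-quantization + Markov-model comparison.
# The transition matrices are illustrative assumptions, not trained models.
import numpy as np

N_STATES = 3  # quantized motion-direction states

# Assumption: dynamic textures change direction almost at random,
# while ordinary objects tend to keep their current direction.
A_TEXTURE = np.full((N_STATES, N_STATES), 1.0 / N_STATES)
A_OBJECT = np.array([[0.8, 0.1, 0.1],
                     [0.1, 0.8, 0.1],
                     [0.1, 0.1, 0.8]])

def quantize_directions(angles_rad):
    """Map motion-vector angles (radians) to one of three direction states."""
    bins = (np.asarray(angles_rad) % (2 * np.pi)) / (2 * np.pi) * N_STATES
    return bins.astype(int) % N_STATES

def log_likelihood(states, trans):
    """Log-probability of a state sequence under a first-order Markov chain."""
    ll = 0.0
    for prev, cur in zip(states[:-1], states[1:]):
        ll += np.log(trans[prev, cur])
    return ll

def classify(angles_rad):
    states = quantize_directions(angles_rad)
    lt = log_likelihood(states, A_TEXTURE)
    lo = log_likelihood(states, A_OBJECT)
    return "dynamic texture" if lt > lo else "ordinary moving object"

# Erratic directions should score as texture-like, a steady direction as object-like.
rng = np.random.default_rng(0)
print(classify(rng.uniform(0, 2 * np.pi, size=50)))  # expected: dynamic texture
print(classify(np.full(50, 0.1)))                    # expected: ordinary moving object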
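The "Moving region detection in compressed video" entry notes that, for DCT-compressed video, the DC values of the 8 by 8 blocks are enough to estimate the background and flag moving regions without decoding full pixels. A minimal sketch of that idea follows; the update weight, the threshold, and the selective-update rule are assumptions for illustration, not the parameters used in the paper.

# Minimal sketch of moving-block detection from DC coefficients, assuming the
# DC value of each 8x8 luminance block is already available from the bitstream.
import numpy as np

ALPHA = 0.95      # background update weight (assumption)
THRESHOLD = 20.0  # DC difference that marks a block as moving (assumption)

def detect_moving_blocks(background_dc, frame_dc):
    """Flag 8x8 blocks whose DC value departs from the background estimate."""
    return np.abs(frame_dc - background_dc) > THRESHOLD

def update_background(background_dc, frame_dc, moving_mask):
    """Refresh the background DC estimate only where no motion was flagged."""
    blended = ALPHA * background_dc + (1.0 - ALPHA) * frame_dc
    return np.where(moving_mask, background_dc, blended)

# Example: a 72x90 grid of DC values (one per 8x8 block of a 576x720 frame).
rng = np.random.default_rng(1)
background = rng.uniform(0.0, 255.0, size=(72, 90))
frame = background + rng.normal(0.0, 2.0, size=background.shape)
frame[30:40, 40:50] += 80.0        # an object brightens this block region
mask = detect_moving_blocks(background, frame)
background = update_background(background, frame, mask)
print(int(mask.sum()), "blocks flagged as moving")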
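Both compressed-domain entries ("Moving region detection in compressed video" and "Moving region detection in wavelet compressed video") compare the wavelet transform of the current frame against a background estimate maintained in the transform domain, avoiding an inverse transform. The sketch below uses a hand-rolled one-level Haar low-low subband purely for illustration; the threshold, the update weight, and the mapping of each coefficient to a 2 by 2 pixel block are assumptions, not the papers' exact procedure.

# Minimal sketch of the wavelet-domain variant: compare the low-low subband of
# the current frame with a background subband estimate, then map flagged
# coefficients back to pixel regions.
import numpy as np

ALPHA = 0.95       # background update weight (assumption)
THRESHOLD = 15.0   # subband difference that marks a region as moving (assumption)

def haar_ll(image):
    """Low-low subband of a one-level Haar transform (2x2 block averages)."""
    return 0.25 * (image[0::2, 0::2] + image[1::2, 0::2]
                   + image[0::2, 1::2] + image[1::2, 1::2])

def moving_regions(background_ll, frame):
    """Detect moving regions in the LL subband and map them back to pixels."""
    frame_ll = haar_ll(frame)
    coeff_mask = np.abs(frame_ll - background_ll) > THRESHOLD
    # Each level-1 LL coefficient corresponds to a 2x2 block of pixels.
    pixel_mask = np.kron(coeff_mask.astype(np.uint8),
                         np.ones((2, 2), dtype=np.uint8)).astype(bool)
    # Refresh the background estimate only where no motion was detected.
    blended = ALPHA * background_ll + (1.0 - ALPHA) * frame_ll
    new_background_ll = np.where(coeff_mask, background_ll, blended)
    return pixel_mask, new_background_ll

# Example: a 64x64 synthetic scene with a bright square entering the frame.
rng = np.random.default_rng(2)
scene = rng.uniform(0.0, 255.0, size=(64, 64))
background_ll = haar_ll(scene)
frame = scene.copy()
frame[10:20, 10:20] += 60.0
mask, background_ll = moving_regions(background_ll, frame)
print(int(mask.sum()), "pixels inside detected moving regions")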
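The last entry ("Oscillatory synchronization model of attention to moving objects") proposes that excitatory feedback phase-locks target neurons to a common oscillation while inhibitory feedback pushes distractor neurons out of phase, so that a downstream stage gated by the same oscillation passes only target activity. The leaky integrate-and-fire sketch below shows that contrast with two units; all parameters (time constant, drive, feedback gain, 8 Hz oscillation) are illustrative assumptions rather than values from the paper.

# Two leaky integrate-and-fire units sharing a common 8 Hz oscillation:
# the "target" receives the oscillation as excitatory feedback, the
# "distractor" receives it with inverted (inhibitory) sign.
import numpy as np

DT, T_TOTAL = 1.0, 2000.0         # time step and duration (ms)
TAU, V_TH, V_RESET = 20.0, 1.0, 0.0
DRIVE = 0.03                       # constant feedforward drive (assumption)
FB_GAIN = 0.04                     # oscillatory feedback amplitude (assumption)
FREQ = 0.008                       # cycles per ms, i.e. 8 Hz (assumption)

def simulate(feedback_sign):
    """Simulate one LIF unit; feedback_sign=+1 for target, -1 for distractor."""
    v, spikes = 0.0, []
    for k in range(int(T_TOTAL / DT)):
        t = k * DT
        drive = DRIVE + feedback_sign * FB_GAIN * np.sin(2 * np.pi * FREQ * t)
        v += DT * (-v / TAU + drive)
        if v >= V_TH:
            spikes.append(t)
            v = V_RESET
    return np.array(spikes)

def mean_oscillation_at_spikes(spike_times):
    """Average value of the common oscillation at the unit's spike times."""
    if len(spike_times) == 0:
        return 0.0
    return float(np.mean(np.sin(2 * np.pi * FREQ * spike_times)))

target_spikes = simulate(+1)
distractor_spikes = simulate(-1)
# Target spikes cluster near the oscillation peak (value near +1), distractor
# spikes near the trough (value near -1); a coincidence detector gated by the
# same oscillation would therefore propagate only the target's activity.
print("target:    ", len(target_spikes), "spikes,",
      round(mean_oscillation_at_spikes(target_spikes), 2))
print("distractor:", len(distractor_spikes), "spikes,",
      round(mean_oscillation_at_spikes(distractor_spikes), 2))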
