Computer vision based behavior analysis

Date

2009

Advisor

Özgüler, A. Bülent

Publisher

Bilkent University

Language

English

Abstract

In this thesis, recognition and understanding of behavior based on visual inputs and automated decision schemes are investigated. Behavior analysis is carried out on a wide scope, ranging from animal behavior to human behavior; accordingly, the work is presented in two main parts. Part I investigates the locomotor behavior of laboratory animals, with particular focus on drug screening experiments, and Part II investigates the analysis of human behavior, with specific focus on visual attention.

The animal behavior analysis method presented in Part I consists of motion tracking based on background subtraction, extraction of discriminative behavioral characteristics from the resulting path and speed information, summarization of these characteristics as feature vectors, and classification of the feature vectors. The experiments in Part I indicate that the proposed system is highly useful in behavioral and neuropharmacological studies, as well as in drug screening and toxicology studies, owing to its ability to detect discriminative behavioral alterations in response to pharmacological manipulations.

The human behavior analysis scheme presented in Part II offers an efficient method for resolving attention fixation points in unconstrained settings, adopting a developmental-psychology point of view. The experimenter's head is modeled as an elliptic cylinder, the head model is tracked using the Lucas-Kanade optical flow method, and the pose values are estimated accordingly. The resolved poses are then transformed into the gaze direction and the depth of the attended object through two Gaussian regressors. The regression outputs are superposed to obtain initial estimates of object center locations; these estimates are pooled to mimic human saccades realistically, and saliency is computed in the prospective region to determine the final attention fixation points. The extensive generalization capability of this method suggests that rapid gaze estimation can also be achieved for establishing joint attention in interaction-driven robot communication.
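The Part I pipeline described above (background subtraction, path and speed extraction, feature summarization) can be sketched as follows. This is a minimal illustration, not the thesis's exact method: the median background model, the foreground threshold, and the particular speed statistics are all assumptions made for the sketch.

```python
import numpy as np

def foreground_masks(frames, thresh=30):
    """Background subtraction with a median background model (threshold assumed)."""
    background = np.median(frames, axis=0)
    return np.abs(frames - background) > thresh

def track_path(masks):
    """Centroid of foreground pixels per frame -> (x, y) path of the animal."""
    path = []
    for mask in masks:
        ys, xs = np.nonzero(mask)
        if xs.size:
            path.append((xs.mean(), ys.mean()))
        elif path:
            path.append(path[-1])          # hold last position if detection fails
        else:
            path.append((0.0, 0.0))
    return np.array(path)

def feature_vector(path, fps=25.0, rest_speed=1.0):
    """Summarize speed along the path as a fixed-length feature vector
    (statistics chosen for illustration): mean, std, max speed, resting ratio."""
    speeds = np.linalg.norm(np.diff(path, axis=0), axis=1) * fps
    return np.array([speeds.mean(), speeds.std(),
                     speeds.max(), (speeds < rest_speed).mean()])
```

Feature vectors of this kind would then be fed to a standard classifier to separate, for example, drug-treated from control animals.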
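For Part II, one simple form the pose-to-gaze mapping could take is kernel ridge regression with a Gaussian (RBF) kernel. This is a plausible stand-in for the Gaussian regressors mentioned in the abstract, not the thesis's actual model; the kernel width, regularization, and pose parameterization below are assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between two sets of pose vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class GaussianRegressor:
    """Kernel ridge regression with a Gaussian kernel -- a possible stand-in
    for a pose-to-gaze regressor (hyperparameters assumed)."""
    def __init__(self, gamma=0.5, alpha=1e-6):
        self.gamma, self.alpha = gamma, alpha

    def fit(self, poses, targets):
        self.poses = poses
        K = gaussian_kernel(poses, poses, self.gamma)
        self.weights = np.linalg.solve(K + self.alpha * np.eye(len(poses)), targets)
        return self

    def predict(self, poses):
        return gaussian_kernel(poses, self.poses, self.gamma) @ self.weights
```

In the scheme described above, one such regressor would predict the gaze direction and a second the depth of the attended object; superposing the two outputs yields the initial estimate of the object center.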
