Computer vision based behavior analysis

buir.advisor: Özgüler, A. Bülent
dc.contributor.author: Yücel, Zeynep
dc.date.accessioned: 2016-01-08T20:05:48Z
dc.date.available: 2016-01-08T20:05:48Z
dc.date.issued: 2009
dc.description: Ankara : The Department of Electrical and Electronics Engineering and the Institute of Engineering and Science of Bilkent University, 2009.
dc.description: Thesis (Ph. D.) -- Bilkent University, 2009.
dc.description: Includes bibliographical references (leaves 111-124).
dc.description.abstract: In this thesis, recognition and understanding of behavior based on visual inputs and automated decision schemes are investigated. Behavior analysis is carried out over a wide scope, ranging from animal behavior to human behavior. Due to this extensive coverage, we present our work in two main parts. Part I of the thesis investigates the locomotor behavior of lab animals with particular focus on drug screening experiments, and Part II investigates the analysis of behavior in humans, with specific focus on visual attention. The animal behavior analysis method presented in Part I is composed of motion tracking based on background subtraction, determination of discriminative behavioral characteristics from the extracted path and speed information, summarization of these characteristics in terms of feature vectors, and classification of the feature vectors. The experiments presented in Part I indicate that the proposed animal behavior analysis system proves very useful in behavioral and neuropharmacological studies as well as in drug screening and toxicology studies. This is due to the superior capability of the proposed method in detecting discriminative behavioral alterations in response to pharmacological manipulations. The human behavior analysis scheme presented in Part II proposes an efficient method to resolve attention fixation points in unconstrained settings, adopting a developmental psychology point of view. The head of the experimenter is modeled as an elliptic cylinder. The head model is tracked using the Lucas-Kanade optical flow method, and the pose values are estimated accordingly. The resolved poses are then transformed into the gaze direction and the depth of the attended object through two Gaussian regressors. The regression outputs are superposed to find the initial estimates for object center locations. These estimates are pooled to mimic human saccades realistically, and saliency is computed in the prospective region to determine the final estimates for attention fixation points. Given the extensive generalization capabilities of the human behavior analysis method in Part II, we propose that rapid gaze estimation can also be achieved for establishing joint attention in interaction-driven robot communication.
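To make the Part I pipeline described in the abstract more concrete, the following is a minimal sketch of background-subtraction tracking followed by simple path and speed features. It is not the thesis' own implementation: the choice of OpenCV's MOG2 subtractor, the video file name, and all parameter values are illustrative assumptions, and the downstream classifier mentioned in the abstract is omitted.

```python
# Sketch: track a single animal by background subtraction and summarize its
# path as a small feature vector (assumed OpenCV-based, not the thesis' code).
import cv2
import numpy as np

def track_and_describe(video_path, fps=25.0):
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    centroids = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Drop shadow pixels (value 127 in MOG2 output) and clean the mask.
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        # Keep the largest foreground blob and record its centroid.
        largest = max(contours, key=cv2.contourArea)
        m = cv2.moments(largest)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    cap.release()

    # Summarize the extracted path as speed statistics for a classifier.
    path = np.array(centroids)
    if len(path) < 2:
        return None
    steps = np.linalg.norm(np.diff(path, axis=0), axis=1)  # pixels per frame
    speeds = steps * fps                                    # pixels per second
    return {
        "total_distance": float(steps.sum()),
        "mean_speed": float(speeds.mean()),
        "speed_std": float(speeds.std()),
        "max_speed": float(speeds.max()),
    }

if __name__ == "__main__":
    features = track_and_describe("open_field_session.avi")  # placeholder path
    print(features)
```

In the thesis, feature vectors of this kind are subsequently classified to detect behavioral alterations induced by pharmacological manipulations; that classification stage is not shown here.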
dc.description.statementofresponsibility: Yücel, Zeynep
dc.format.extent: xvii, 124 leaves, illustrations
dc.identifier.uri: http://hdl.handle.net/11693/17049
dc.language.iso: English
dc.rights: info:eu-repo/semantics/openAccess
dc.subject.lcc: TA1634 .Y83 2009
dc.subject.lcsh: Computer vision.
dc.title: Computer vision based behavior analysis
dc.type: Thesis
thesis.degree.discipline: Electrical and Electronic Engineering
thesis.degree.grantor: Bilkent University
thesis.degree.level: Doctoral
thesis.degree.name: Ph.D. (Doctor of Philosophy)

Files

Original bundle

Name: 0006943.pdf
Size: 3.61 MB
Format: Adobe Portable Document Format