Activity analysis for assistive systems
buir.advisor | Şahin, Pınar Duygulu | |
dc.contributor.author | İşcen, Ahmet | |
dc.date.accessioned | 2016-01-08T18:28:07Z | |
dc.date.available | 2016-01-08T18:28:07Z | |
dc.date.issued | 2014 | |
dc.description | Cataloged from PDF version of article. | en_US |
dc.description | Includes bibliographical references (leaves 47-51). | en_US |
dc.description.abstract | Although understanding and analyzing human actions is a popular research topic in computer vision, most of the research has focused on recognizing "ordinary" actions, such as walking and jumping. Extending these methods to more specific domains, such as assistive technologies, is not a trivial task. In most cases, these applications contain more fine-grained activities with low inter-class variance and high intra-class variance. In this thesis, we propose to use motion information from snippets, or small video intervals, in order to recognize actions in daily activities. The proposed method encodes motion through motion statistics, such as the variance and the length of trajectories, and encodes position information using a spatial grid. We show that such an approach is especially helpful for the domain of medical device usage, which contains actions with fast movements. Another contribution is to model the sequential information of actions by the order in which they occur. This is especially useful for fine-grained activities, such as cooking activities, where visual information alone may not be enough to distinguish between different actions. On the visual side of the problem, we propose to combine multiple visual descriptors by weighting their confidence values. Our experiments show that the temporal sequence model and the fusion of multiple descriptors significantly improve performance when used together. | en_US |
dc.description.statementofresponsibility | İşcen, Ahmet | en_US |
dc.format.extent | xi, 51 leaves, illustrations | en_US |
dc.identifier.itemid | B147911 | |
dc.identifier.uri | http://hdl.handle.net/11693/15983 | |
dc.language.iso | English | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Assistive living systems | en_US |
dc.subject | Action and activity recognition | en_US |
dc.subject.lcc | TK7882.P7 I83 2014 | en_US |
dc.subject.lcsh | Human activity recognition. | en_US |
dc.subject.lcsh | Image analysis. | en_US |
dc.subject.lcsh | Image processing. | en_US |
dc.subject.lcsh | Image processing--Digital techniques. | en_US |
dc.title | Activity analysis for assistive systems | en_US |
dc.type | Thesis | en_US |
thesis.degree.discipline | Computer Engineering | |
thesis.degree.grantor | Bilkent University | |
thesis.degree.level | Master's | |
thesis.degree.name | MS (Master of Science) |
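The abstract above describes encoding motion from video snippets through trajectory statistics (length and variance) combined with a spatial grid for position. The sketch below is a hypothetical illustration of that idea, assuming point trajectories have already been extracted; the function name, grid size, and normalization are illustrative assumptions, not the thesis implementation.

# Hypothetical sketch of snippet-based motion encoding; parameter choices
# (grid size, statistics, normalization) are assumptions for illustration only.
import numpy as np

def encode_snippet(trajectories, frame_size, grid=(4, 4)):
    """Encode one video snippet from its point trajectories.

    trajectories: list of (T_i, 2) arrays of (x, y) point positions.
    frame_size:   (width, height) of the video frames.
    grid:         spatial grid used to encode where the motion happens.
    Returns a fixed-length descriptor: per-cell mean trajectory length and variance.
    """
    w, h = frame_size
    gx, gy = grid
    stats = np.zeros((gy, gx, 2))   # per cell: [trajectory length, displacement variance]
    counts = np.zeros((gy, gx))

    for traj in trajectories:
        if len(traj) < 2:
            continue
        disp = np.diff(traj, axis=0)                 # frame-to-frame displacements
        length = np.linalg.norm(disp, axis=1).sum()  # total trajectory length
        variance = disp.var()                        # motion variance (fast vs. steady motion)

        # Assign the trajectory to a grid cell by its mean position.
        mx, my = traj.mean(axis=0)
        cx = min(int(mx / w * gx), gx - 1)
        cy = min(int(my / h * gy), gy - 1)
        stats[cy, cx] += (length, variance)
        counts[cy, cx] += 1

    counts = np.maximum(counts, 1)                   # avoid division by zero in empty cells
    descriptor = (stats / counts[..., None]).ravel()
    # L2-normalize so snippets with different trajectory counts are comparable.
    norm = np.linalg.norm(descriptor)
    return descriptor / norm if norm > 0 else descriptor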
Files
Original bundle
- Name: 0006687.pdf
- Size: 8.05 MB
- Format: Adobe Portable Document Format
- Description: Full printable version