Activity recognition invariant to position and orientation of wearable motion sensor units
buir.advisor | Barshan, Billur | |
dc.contributor.author | Yurtman, Aras | |
dc.date.accessioned | 2019-04-30T12:20:44Z | |
dc.date.available | 2019-04-30T12:20:44Z | |
dc.date.copyright | 2019-04 | |
dc.date.issued | 2019-04 | |
dc.date.submitted | 2019-04-30 | |
dc.description | Cataloged from PDF version of article. | en_US |
dc.description | Thesis (Ph.D.): İhsan Doğramacı Bilkent University, Department of Electrical and Electronics Engineering, 2019. | en_US |
dc.description | Includes bibliographical references (leaves 146-162). | en_US |
dc.description.abstract | We propose techniques that achieve invariance to the placement of wearable motion sensor units in the context of human activity recognition. First, we focus on invariance to sensor unit orientation and develop three alternative transformations to remove from the raw sensor data the effect of the orientation at which the sensor unit is placed. The first two orientation-invariant transformations rely on the geometry of the measurements, whereas the third is based on estimating the orientations of the sensor units with respect to the Earth frame by exploiting the physical properties of the sensory data. We test them with multiple state-of-the-art machine-learning classifiers using five publicly available datasets (where applicable) containing various types of activities acquired by different sensor configurations. We show that the proposed methods achieve accuracy similar to that of the reference system in which the units are correctly oriented, whereas the standard system cannot handle incorrectly oriented sensors. We also propose a novel non-iterative technique for estimating the orientations of the sensor units based on the physical and geometrical properties of the sensor data to improve the accuracy of the third orientation-invariant transformation. All three transformations can be integrated into the pre-processing stage of existing wearable systems without much effort since we do not make any assumptions about the sensor configuration, the body movements, or the classification methodology. Second, we develop techniques that achieve invariance to the positioning of the sensor units in three ways: (1) We propose transformations that are applied to the sensory data to allow each unit to be placed at any position within a pre-determined body part. (2) We propose a transformation technique that allows the units to be interchanged so that the user does not need to distinguish between them before positioning. 
(3) We employ three different techniques to classify the activities based on a single sensor unit, whereas the training set may contain data acquired by multiple units placed at different positions. We combine (1) with (2) and also with (3) to achieve further robustness to sensor unit positioning. We evaluate our techniques on a publicly available dataset using seven state-of-the-art classifiers and show that the reduction in accuracy is acceptable, considering the flexibility, convenience, and unobtrusiveness gained in the positioning of the units. Finally, we combine the position- and orientation-invariant techniques to achieve both simultaneously. The accuracy values are much higher than those of random decision making, although some of them are significantly lower than those of the reference system with correctly placed units. The trade-off between the flexibility in sensor unit placement and the classification accuracy indicates that different approaches may be suitable for different applications. | en_US |
dc.description.provenance | Submitted by Betül Özen (ozen@bilkent.edu.tr) on 2019-04-30T12:20:44Z No. of bitstreams: 1 Aras Yurtman - PhD Thesis.pdf: 25557287 bytes, checksum: 62b9e7d0f2a119818c9b51d56e42c961 (MD5) | en |
dc.description.provenance | Made available in DSpace on 2019-04-30T12:20:44Z (GMT). No. of bitstreams: 1 Aras Yurtman - PhD Thesis.pdf: 25557287 bytes, checksum: 62b9e7d0f2a119818c9b51d56e42c961 (MD5) Previous issue date: 2019-04 | en |
dc.description.statementofresponsibility | by Aras Yurtman | en_US |
dc.embargo.release | 2019-10-29 | |
dc.format.extent | xvii, 162 leaves : illustrations (some color), charts (some color) ; 30 cm. | en_US |
dc.identifier.itemid | B155726 | |
dc.identifier.uri | http://hdl.handle.net/11693/51042 | |
dc.language.iso | English | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Wearable sensing | en_US |
dc.subject | Human activity recognition | en_US |
dc.subject | Sensor placement | en_US |
dc.subject | Sensor position | en_US |
dc.subject | Sensor orientation | en_US |
dc.subject | Position-invariant sensing | en_US |
dc.subject | Orientation estimation | en_US |
dc.subject | Motion sensors | en_US |
dc.subject | Inertial sensors | en_US |
dc.subject | Accelerometer | en_US |
dc.subject | Gyroscope | en_US |
dc.subject | Magnetometer | en_US |
dc.title | Activity recognition invariant to position and orientation of wearable motion sensor units | en_US |
dc.title.alternative | Giyilebilir hareket algılayıcı ünitelerinin konum ve yönlerinden bağımsız olarak aktivite tanıma | en_US |
dc.type | Thesis | en_US |
thesis.degree.discipline | Electrical and Electronics Engineering | |
thesis.degree.grantor | Bilkent University | |
thesis.degree.level | Doctoral | |
thesis.degree.name | Ph.D. (Doctor of Philosophy) |
Files
Original bundle
- Name: Aras Yurtman - PhD Thesis.pdf
- Size: 24.37 MB
- Format: Adobe Portable Document Format
- Description: Full printable version
License bundle
- Name: license.txt
- Size: 1.71 KB
- Description: Item-specific license agreed upon to submission