Activity recognition invariant to position and orientation of wearable motion sensor units

buir.advisor: Barshan, Billur
dc.contributor.author: Yurtman, Aras
dc.date.accessioned: 2019-04-30T12:20:44Z
dc.date.available: 2019-04-30T12:20:44Z
dc.date.copyright: 2019-04
dc.date.issued: 2019-04
dc.date.submitted: 2019-04-30
dc.description: Cataloged from PDF version of thesis.
dc.description: Thesis (Ph.D.): İhsan Doğramacı Bilkent University, Department of Electrical and Electronics Engineering, 2019.
dc.description: Includes bibliographical references (leaves 146-162).
dc.description.abstract: We propose techniques that achieve invariance to the placement of wearable motion sensor units in the context of human activity recognition. First, we focus on invariance to sensor unit orientation and develop three alternative transformations to remove from the raw sensor data the effect of the orientation at which the sensor unit is placed. The first two orientation-invariant transformations rely on the geometry of the measurements, whereas the third is based on estimating the orientations of the sensor units with respect to the Earth frame by exploiting the physical properties of the sensory data. We test them with multiple state-of-the-art machine-learning classifiers using five publicly available datasets (when applicable) containing various types of activities acquired by different sensor configurations. We show that the proposed methods achieve accuracy similar to that of the reference system in which the units are correctly oriented, whereas the standard system cannot handle incorrectly oriented sensors. We also propose a novel non-iterative technique for estimating the orientations of the sensor units based on the physical and geometrical properties of the sensor data to improve the accuracy of the third orientation-invariant transformation. All three transformations can be integrated into the pre-processing stage of existing wearable systems without much effort, since we do not make any assumptions about the sensor configuration, the body movements, or the classification methodology.

Secondly, we develop techniques that achieve invariance to the positioning of the sensor units in three ways: (1) we propose transformations that are applied to the sensory data to allow each unit to be placed at any position within a pre-determined body part; (2) we propose a transformation technique that allows the units to be interchanged, so that the user does not need to distinguish between them before positioning; (3) we employ three different techniques to classify the activities based on a single sensor unit, whereas the training set may contain data acquired by multiple units placed at different positions. We combine (1) with (2) and also with (3) to achieve further robustness to sensor unit positioning. We evaluate our techniques on a publicly available dataset using seven state-of-the-art classifiers and show that the reduction in accuracy is acceptable, considering the flexibility, convenience, and unobtrusiveness in the positioning of the units.

Finally, we combine the position- and orientation-invariant techniques to achieve both simultaneously. The accuracy values are much higher than those of random decision making, although some of them are significantly lower than those of the reference system with correctly placed units. The trade-off between the flexibility in sensor unit placement and the classification accuracy indicates that different approaches may be suitable for different applications.
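To make the flavor of these transformations concrete, the following minimal Python/NumPy sketch illustrates two of the ideas summarized in the abstract. It is our own illustration, not the thesis's exact methods: the function names and details are assumptions. It shows (1) a geometry-based representation of a tri-axial sensor sequence that keeps only rotation-invariant quantities, and (2) estimating a unit's orientation with respect to the Earth frame from gravity (accelerometer) and magnetic-field (magnetometer) readings.

import numpy as np

def rotation_invariant_features(x):
    """x: (T, 3) tri-axial sensor sequence (e.g., accelerometer).

    Returns a (T-1, 2) array of per-sample vector norms and angles
    between consecutive sample vectors; both quantities are unchanged
    when the whole sequence is rotated by a fixed, unknown sensor
    orientation."""
    norms = np.linalg.norm(x, axis=1)
    dots = np.sum(x[:-1] * x[1:], axis=1)
    # Cosine of the angle between consecutive samples, clipped so that
    # floating-point round-off cannot push it outside [-1, 1].
    cos = np.clip(dots / (norms[:-1] * norms[1:] + 1e-12), -1.0, 1.0)
    return np.column_stack([norms[1:], np.arccos(cos)])

def earth_frame_rotation(acc_static, mag_static):
    """Estimate a rotation matrix R mapping sensor-frame vectors into a
    north-east-down Earth frame, assuming acc_static and mag_static are
    readings averaged over an interval in which the unit is stationary,
    so the accelerometer measures only the reaction to gravity."""
    down = -acc_static / np.linalg.norm(acc_static)
    east = np.cross(down, mag_static)     # field = north + vertical components
    east /= np.linalg.norm(east)
    north = np.cross(east, down)          # completes the right-handed triad
    return np.stack([north, east, down])  # v_earth = R @ v_sensor

# Sanity check: the features are identical before and after rotating the
# sequence by an arbitrary fixed orientation.
rng = np.random.default_rng(0)
seq = rng.normal(size=(100, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
assert np.allclose(rotation_invariant_features(seq),
                   rotation_invariant_features(seq @ Rz.T))

Such a transform can be dropped into the pre-processing stage of a recognition pipeline, which is consistent with the abstract's claim that the methods make no assumptions about the sensor configuration, body movements, or classifier.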
dc.description.statementofresponsibility: by Aras Yurtman
dc.embargo.release: 2019-10-29
dc.format.extent: xvii, 162 leaves : illustrations (some color), charts (some color) ; 30 cm.
dc.identifier.itemid: B155726
dc.identifier.uri: http://hdl.handle.net/11693/51042
dc.language.iso: English
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Wearable sensing
dc.subject: Human activity recognition
dc.subject: Sensor placement
dc.subject: Sensor position
dc.subject: Sensor orientation
dc.subject: Position-invariant sensing
dc.subject: Orientation estimation
dc.subject: Motion sensors
dc.subject: Inertial sensors
dc.subject: Accelerometer
dc.subject: Gyroscope
dc.subject: Magnetometer
dc.title: Activity recognition invariant to position and orientation of wearable motion sensor units
dc.title.alternative: Giyilebilir hareket algılayıcı ünitelerinin konum ve yönlerinden bağımsız olarak aktivite tanıma [Turkish: Activity recognition independent of the positions and orientations of wearable motion sensing units]
dc.type: Thesis
thesis.degree.discipline: Electrical and Electronics Engineering
thesis.degree.grantor: Bilkent University
thesis.degree.level: Doctoral
thesis.degree.name: Ph.D. (Doctor of Philosophy)

Files

Original bundle
Name: Aras Yurtman - PhD Thesis.pdf
Size: 24.37 MB
Format: Adobe Portable Document Format
Description: Full printable version
