Activity recognition invariant to wearable sensor unit orientation using differential rotational transformations represented by quaternions

buir.contributor.author: Yurtman, Aras
buir.contributor.author: Barshan, Billur
dc.citation.epage: 2725-27 (en_US)
dc.citation.issueNumber: 8 (en_US)
dc.citation.spage: 2725-1 (en_US)
dc.citation.volumeNumber: 18 (en_US)
dc.contributor.author: Yurtman, Aras (en_US)
dc.contributor.author: Barshan, Billur (en_US)
dc.contributor.author: Fidan B. (en_US)
dc.date.accessioned: 2019-02-21T16:08:56Z
dc.date.available: 2019-02-21T16:08:56Z
dc.date.issued: 2018 (en_US)
dc.department: Department of Electrical and Electronics Engineering (en_US)
dc.description.abstract: Wearable motion sensors are assumed to be correctly positioned and oriented in most of the existing studies. However, generic wireless sensor units, patient health and state monitoring sensors, and smart phones and watches that contain sensors can be differently oriented on the body. The vast majority of the existing algorithms are not robust against placing the sensor units at variable orientations. We propose a method that transforms the recorded motion sensor sequences invariantly to sensor unit orientation. The method is based on estimating the sensor unit orientation and representing the sensor data with respect to the Earth frame. We also calculate the sensor rotations between consecutive time samples and represent them by quaternions in the Earth frame. We incorporate our method in the pre-processing stage of the standard activity recognition scheme and provide a comparative evaluation with the existing methods based on seven state-of-the-art classifiers and a publicly available dataset. The standard system with fixed sensor unit orientations cannot handle incorrectly oriented sensors, resulting in an average accuracy reduction of 31.8%. Our method results in an accuracy drop of only 4.7% on average compared to the standard system, outperforming the existing approaches that cause an accuracy degradation between 8.4 and 18.8%. We also consider stationary and non-stationary activities separately and evaluate the performance of each method for these two groups of activities. All of the methods perform significantly better in distinguishing non-stationary activities, our method resulting in an accuracy drop of 2.1% in this case. Our method clearly surpasses the remaining methods in classifying stationary activities where some of the methods noticeably fail. The proposed method is applicable to a wide range of wearable systems to make them robust against variable sensor unit orientations by transforming the sensor data at the pre-processing stage. (en_US)
dc.description.provenance: Made available in DSpace on 2019-02-21T16:08:56Z (GMT). No. of bitstreams: 1. Bilkent-research-paper.pdf: 222869 bytes, checksum: 842af2b9bd649e7f548593affdbafbb3 (MD5). Previous issue date: 2018 (en)
dc.identifier.doi: 10.3390/s18082725
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/11693/50438
dc.language.iso: English
dc.publisher: MDPI AG
dc.relation.isversionof: https://doi.org/10.3390/s18082725
dc.rights: info:eu-repo/semantics/openAccess
dc.source.title: Sensors (Switzerland) (en_US)
dc.subject: Accelerometer (en_US)
dc.subject: Activity recognition and monitoring (en_US)
dc.subject: Gyroscope (en_US)
dc.subject: Magnetometer (en_US)
dc.subject: Motion sensors (en_US)
dc.subject: Orientation-invariant sensing (en_US)
dc.subject: Patient health and state monitoring (en_US)
dc.subject: Pattern classification (en_US)
dc.subject: Wearable sensing (en_US)
dc.title: Activity recognition invariant to wearable sensor unit orientation using differential rotational transformations represented by quaternions (en_US)
dc.type: Article (en_US)
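
The abstract above outlines the method's core pre-processing idea: estimate the sensor unit's orientation at each time sample, express the sensor data in the Earth frame, and represent the rotation between consecutive samples as a quaternion in the Earth frame. The following is a minimal illustrative sketch of that differential-quaternion representation, not the authors' implementation. It assumes NumPy, assumes that per-sample orientation quaternions (sensor frame to Earth frame) have already been estimated by some orientation filter fusing the accelerometer, gyroscope, and magnetometer data, and uses placeholder function names (quat_mul, differential_quaternions, to_earth_frame) chosen here for illustration.

```python
# Illustrative sketch only (not the paper's code). Quaternions are (w, x, y, z),
# assumed unit-norm, mapping the sensor frame to the Earth frame at each sample.
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    """Conjugate (equals the inverse for unit quaternions)."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate_vector(q, v):
    """Rotate a 3-vector v (sensor frame) into the Earth frame: q * (0, v) * q'."""
    qv = np.concatenate(([0.0], v))
    return quat_mul(quat_mul(q, qv), quat_conj(q))[1:]

def to_earth_frame(q_seq, sensor_vectors):
    """Express per-sample sensor vectors (e.g. acceleration) in the Earth frame."""
    return np.array([rotate_vector(q, v) for q, v in zip(q_seq, sensor_vectors)])

def differential_quaternions(q_seq):
    """Rotation from sample t-1 to sample t, expressed in the Earth frame:
    d_t = q_t * conj(q_{t-1})."""
    return np.array([quat_mul(q_seq[t], quat_conj(q_seq[t - 1]))
                     for t in range(1, len(q_seq))])
```

In a pipeline following the abstract's description, the Earth-frame sensor sequences and the differential-quaternion sequence would replace the raw (orientation-dependent) sensor data at the pre-processing stage, before feature extraction and classification; since both representations are referenced to the Earth frame rather than the unit's own axes, they do not change when the unit is mounted at a different orientation on the body.
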

Files

Original bundle

Name: Activity-Recognition-Invariant-to-Wearable-Sensor-Unit-Orientation-Using-Differential-Rotational-Transformations-Represented-by-Quaternions.pdf
Size: 12.32 MB
Format: Adobe Portable Document Format
Description: Full printable version