Investigating inter-subject and inter-activity variations in activity recognition using wearable motion sensors

dc.citation.epage: 1362
dc.citation.issueNumber: 9
dc.citation.spage: 1345
dc.citation.volumeNumber: 59
dc.contributor.author: Barshan, B.
dc.contributor.author: Yurtman, A.
dc.date.accessioned: 2018-04-12T10:44:33Z
dc.date.available: 2018-04-12T10:44:33Z
dc.date.issued: 2016
dc.department: Department of Electrical and Electronics Engineering
dc.description.abstract: This work investigates inter-subject and inter-activity variability of a given activity dataset and provides some new definitions to quantify such variability. The definitions are sufficiently general and can be applied to a broad class of datasets that involve time sequences or features acquired using wearable sensors. The study is motivated by contradictory statements in the literature on the need for user-specific training in activity recognition. We employ our publicly available dataset that contains 19 daily and sports activities acquired from eight participants who wear five motion sensor units each. We pre-process recorded activity time sequences in three different ways and employ absolute, Euclidean and dynamic time warping distance measures to quantify the similarity of the recorded signal patterns. We define and calculate the average inter-subject and inter-activity distances with various methods based on the raw and pre-processed time-domain data as well as on the raw and pre-processed feature vectors. These definitions allow us to identify the subject who performs the activities in the most representative way and pinpoint the activities that show more variation among the subjects. We observe that the type of pre-processing used affects the results of the comparisons but that the different distance measures do not alter the comparison results as much. We check the consistency of our analysis and results by highlighting some of our activity recognition rates based on an exhaustive set of sensor unit, sensor type and subject combinations. We expect the results to be useful for dynamic sensor unit/type selection, for deciding whether to perform user-specific training and for designing more effective classifiers in activity recognition.
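The abstract compares signal patterns with absolute, Euclidean and dynamic time warping (DTW) distance measures. A minimal sketch of the three measures for 1-D signals is shown below; this is an illustration of the standard definitions, not the authors' implementation, and the function names are my own:

```python
import numpy as np

def absolute_distance(x, y):
    """Sum of absolute differences (L1) between two equal-length signals."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum(np.abs(x - y)))

def euclidean_distance(x, y):
    """Euclidean (L2) distance between two equal-length signals."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sqrt(np.sum((x - y) ** 2)))

def dtw_distance(x, y):
    """Classic O(n*m) dynamic time warping distance.

    Unlike the two measures above, DTW tolerates signals of
    different lengths and local time shifts by finding the
    minimum-cost alignment between the two sequences.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)  # accumulated-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the cheapest of the three admissible moves
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return float(D[n, m])
```

Averaging such pairwise distances over all subject pairs (for one activity) or activity pairs (for one subject) yields the inter-subject and inter-activity distances the abstract refers to.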
dc.description.provenance: Made available in DSpace on 2018-04-12T10:44:33Z (GMT). No. of bitstreams: 1; bilkent-research-paper.pdf: 179475 bytes, checksum: ea0bedeb05ac9ccfb983c327e155f0c2 (MD5). Previous issue date: 2016
dc.identifier.doi: 10.1093/comjnl/bxv093
dc.identifier.issn: 0010-4620
dc.identifier.uri: http://hdl.handle.net/11693/36569
dc.language.iso: English
dc.publisher: Oxford University Press
dc.relation.isversionof: http://dx.doi.org/10.1093/comjnl/bxv093
dc.source.title: Computer Journal
dc.subject: Accelerometer
dc.subject: Activity recognition and classification
dc.subject: Dynamic time warping
dc.subject: Feature extraction
dc.subject: Feature reduction
dc.subject: Gyroscope
dc.subject: Inertial sensors
dc.subject: Inter-activity variation
dc.subject: Inter-subject variation
dc.subject: Magnetometers
dc.subject: Motion capture
dc.subject: Motion sensors
dc.subject: Wearable sensing
dc.title: Investigating inter-subject and inter-activity variations in activity recognition using wearable motion sensors
dc.type: Article

Files

Original bundle

Name: Investigating Inter-Subject and Inter-Activity Variations in Activity Recognition Using Wearable Motion Sensors.pdf
Size: 384.13 KB
Format: Adobe Portable Document Format
Description: Full printable version