Browsing by Author "Yurtman, Aras"
Now showing 1 - 11 of 11
Item Open Access: Activity recognition invariant to position and orientation of wearable motion sensor units (Bilkent University, 2019-04) Yurtman, Aras

We propose techniques that achieve invariance to the placement of wearable motion sensor units in the context of human activity recognition. First, we focus on invariance to sensor unit orientation and develop three alternative transformations to remove from the raw sensor data the effect of the orientation at which the sensor unit is placed. The first two orientation-invariant transformations rely on the geometry of the measurements, whereas the third is based on estimating the orientations of the sensor units with respect to the Earth frame by exploiting the physical properties of the sensory data. We test them with multiple state-of-the-art machine-learning classifiers using five publicly available datasets (when applicable) containing various types of activities acquired by different sensor configurations. We show that the proposed methods achieve an accuracy similar to that of the reference system where the units are correctly oriented, whereas the standard system cannot handle incorrectly oriented sensors. We also propose a novel non-iterative technique for estimating the orientations of the sensor units based on the physical and geometrical properties of the sensor data to improve the accuracy of the third orientation-invariant transformation. All three transformations can be integrated into the pre-processing stage of existing wearable systems without much effort, since we do not make any assumptions about the sensor configuration, the body movements, or the classification methodology. Secondly, we develop techniques that achieve invariance to the positioning of the sensor units in three ways: (1) We propose transformations that are applied to the sensory data to allow each unit to be placed at any position within a pre-determined body part. (2) We propose a transformation technique to allow the units to be interchanged so that the user does not need to distinguish between them before positioning. (3) We employ three different techniques to classify the activities based on a single sensor unit, whereas the training set may contain data acquired by multiple units placed at different positions. We combine (1) with (2) and also with (3) to achieve further robustness to sensor unit positioning. We evaluate our techniques on a publicly available dataset using seven state-of-the-art classifiers and show that the reduction in accuracy is acceptable, considering the flexibility, convenience, and unobtrusiveness in the positioning of the units. Finally, we combine the position- and orientation-invariant techniques to achieve both simultaneously. The accuracy values are much higher than those of random decision making, although some of them are significantly lower than those of the reference system with correctly placed units. The trade-off between the flexibility in sensor unit placement and the classification accuracy indicates that different approaches may be suitable for different applications.

Item Open Access: Activity recognition invariant to wearable sensor unit orientation using differential rotational transformations represented by quaternions (MDPI AG, 2018) Yurtman, Aras; Barshan, Billur; Fidan, B.

Wearable motion sensors are assumed to be correctly positioned and oriented in most of the existing studies. However, generic wireless sensor units, patient health and state monitoring sensors, and smartphones and watches that contain sensors can be oriented differently on the body. The vast majority of the existing algorithms are not robust against placing the sensor units at variable orientations. We propose a method that transforms the recorded motion sensor sequences invariantly to sensor unit orientation.
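For intuition, one building block of such a transformation, the differential rotation between consecutive time samples, can be sketched as follows. This is a minimal illustration rather than the authors' implementation: it assumes per-sample orientation quaternions of the sensor unit in the Earth frame are already available, and the helper names and quaternion convention (Hamilton product, [w, x, y, z] order) are assumptions.

```python
import numpy as np

def quat_conjugate(q):
    """Conjugate of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(q1, q2):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def differential_rotations(orientations):
    """Rotation between consecutive samples: q_diff = q_t * conj(q_{t-1}).

    `orientations` holds one unit quaternion per time sample, expressing
    the sensor frame in the Earth frame. Because each output depends only
    on the change between samples, a fixed mounting orientation of the
    unit cancels out of the sequence.
    """
    return [quat_multiply(q_t, quat_conjugate(q_prev))
            for q_prev, q_t in zip(orientations, orientations[1:])]
```

For example, if the first sample is the identity orientation and the second is a 90-degree rotation about the vertical axis, the single differential rotation equals that second quaternion.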
The method is based on estimating the sensor unit orientation and representing the sensor data with respect to the Earth frame. We also calculate the sensor rotations between consecutive time samples and represent them by quaternions in the Earth frame. We incorporate our method into the pre-processing stage of the standard activity recognition scheme and provide a comparative evaluation with the existing methods based on seven state-of-the-art classifiers and a publicly available dataset. The standard system with fixed sensor unit orientations cannot handle incorrectly oriented sensors, resulting in an average accuracy reduction of 31.8%. Our method results in an accuracy drop of only 4.7% on average compared to the standard system, outperforming the existing approaches, which cause an accuracy degradation between 8.4% and 18.8%. We also consider stationary and non-stationary activities separately and evaluate the performance of each method for these two groups of activities. All of the methods perform significantly better in distinguishing non-stationary activities, our method resulting in an accuracy drop of 2.1% in this case. Our method clearly surpasses the remaining methods in classifying stationary activities, where some of the methods noticeably fail. The proposed method is applicable to a wide range of wearable systems to make them robust against variable sensor unit orientations by transforming the sensor data at the pre-processing stage.

Item Open Access: Classifying daily and sports activities invariantly to the positioning of wearable motion sensor units (IEEE, 2020) Barshan, Billur; Yurtman, Aras

We propose techniques that achieve invariance to the positioning of wearable motion sensor units on the body for the recognition of daily and sports activities. Using two sequence sets based on the sensory data allows each unit to be placed at any position on a given rigid body part.
As the unit is shifted from its ideal position with larger displacements, the activity recognition accuracy of the system that uses these sequence sets degrades slowly, whereas that of the reference system (which is not designed to achieve position invariance) drops very fast. Thus, we observe a trade-off between the flexibility in sensor unit positioning and the classification accuracy. The reduction in accuracy is at acceptable levels, considering the convenience and flexibility provided to the user in the placement of the units. We compare the proposed approach with an existing technique to achieve position invariance and combine the former with our earlier methodology to achieve orientation invariance. We evaluate our proposed methodology on a publicly available dataset of daily and sports activities acquired by wearable motion sensor units. The proposed representations can be integrated into the preprocessing stage of existing wearable systems without significant effort.

Item Open Access: Detection and evaluation of physical therapy exercises by dynamic time warping using wearable motion sensor units (Springer, 2014) Yurtman, Aras; Barshan, Billur

We develop an autonomous system that detects and evaluates physical therapy exercises using wearable motion sensors. We propose an algorithm that detects all the occurrences of one or more template signals (representing exercise movements) in a long signal acquired during a physical therapy session. In matching the signals, the algorithm allows some distortion in time, based on dynamic time warping (DTW). The algorithm classifies each execution as one of the exercises and evaluates it as correct/incorrect, giving the error type if there is any. It also provides a quantitative measure of similarity between each matched execution and its template.
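A bare-bones version of the DTW dissimilarity at the core of such a matcher can be sketched as follows. This is a textbook implementation for 1-D sequences, not the paper's detector, which additionally slides templates over the long session recording and thresholds the dissimilarity to decide on detections.

```python
import numpy as np

def dtw_distance(template, signal):
    """Classic dynamic time warping distance between two 1-D sequences.

    D[i, j] holds the minimal cumulative cost of aligning the first i
    template samples with the first j signal samples, allowing each
    sample to be stretched (repeated) in time.
    """
    n, m = len(template), len(signal)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(template[i - 1] - signal[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # stretch signal
                                 D[i, j - 1],      # stretch template
                                 D[i - 1, j - 1])  # one-to-one match
    return D[n, m]
```

Because DTW tolerates time distortion, a template matches a slower execution of the same movement at zero cost, e.g. `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is 0.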
To evaluate the performance of the algorithm in physical therapy, a dataset is recorded consisting of one template execution and ten test executions of each of the three execution types of eight exercises performed by five subjects, with a total of 120 and 1,200 exercise executions in the training and test sets, respectively, as well as many idle time intervals in the test signals. The proposed algorithm detects 1,125 executions in the whole test set. 8.58% of the 1,200 executions are missed and 4.91% of the idle time intervals are incorrectly detected as executions. The accuracy is 93.46% for exercise classification only and 88.65% for simultaneous exercise and execution type classification. The proposed system may be used both for estimating the intensity of the physical therapy session and for evaluating the executions to provide feedback to the patient and the specialist.

Item Open Access: Fizik tedavi egzersizlerinin giyilebilir hareket algılayıcıları işaretlerinden dinamik zaman bükmesiyle sezimi ve değerlendirilmesi (IEEE, 2014-04) Yurtman, Aras; Barshan, Billur

An autonomous system is developed to detect and evaluate physical therapy exercises by processing signals recorded from wearable motion sensors. To detect one or more exercise types in a physical therapy session, an algorithm based on the dynamic time warping (DTW) dissimilarity measure is developed. The algorithm evaluates whether the exercises are performed correctly or incorrectly and identifies the error type, if any. To evaluate the algorithm's performance, a dataset is recorded consisting of one template and ten test executions for each of the three execution types of eight exercise movements performed by five participants. Hence, the training and test sets contain 120 and 1,200 exercise executions, respectively. The test set also contains idle time intervals.
The proposed algorithm misses 8.58% of the 1,200 executions in the test set and incorrectly detects 4.91% of the idle time intervals as executions, detecting a total of 1,125 executions. The accuracy is 93.46% when only exercise classification is considered, and 88.65% for both exercise and execution type classification. To test the system's behavior with unknown exercises, the algorithm is run for each exercise with that exercise's templates left out, and only 10 of the 1,200 executions are falsely detected. This result shows that the system is robust against unknown movements. The proposed system can be used both to estimate the intensity of a physical therapy session and to evaluate exercise movements in order to provide feedback to the patient and the physical therapy specialist.

Item Open Access: Human activity recognition using tag-based localization (IEEE, 2012-04) Yurtman, Aras; Barshan, Billur

This paper provides a comparative study of different techniques for classifying human activities using a tag-based radio-frequency (RF) localization system. Non-uniformly sampled data containing position measurements of the tags on the body is first converted to a uniformly sampled form using different curve-fitting algorithms. Then, the data is partitioned into segments. Finally, various classification techniques are applied to classify human activities. Curve-fitting, segmentation, and classification methods are compared using different cross-validation techniques, and the combination resulting in the best performance is presented.
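The uniform re-sampling step can be illustrated with plain linear interpolation, one of many possible curve fits; the function name and signature here are illustrative, not taken from the paper, which compares several curve-fitting algorithms for this stage.

```python
import numpy as np

def resample_uniform(t, x, num_samples):
    """Re-sample a non-uniformly sampled 1-D signal onto a uniform grid.

    t            -- strictly increasing sample times (non-uniform)
    x            -- measured values at those times
    num_samples  -- number of points in the uniform output grid
    """
    t_uniform = np.linspace(t[0], t[-1], num_samples)
    # Linear interpolation stands in for the curve-fitting step.
    return t_uniform, np.interp(t_uniform, t, x)
```

For a signal that happens to be linear in time, the interpolated values fall exactly on the line regardless of how irregular the original sample times were.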
The results indicate that the system demonstrates acceptable performance despite the fact that tag-based RF localization is not very accurate.

Item Open Access: Inter- and intra-subject variations in activity recognition using inertial sensors and magnetometers (2012-02) Yurtman, Aras; Barshan, Billur

Item Open Access: Investigation of personal variations in activity recognition using miniature inertial sensors and magnetometers (IEEE, 2012-04) Yurtman, Aras; Barshan, Billur

In this paper, data acquired from five sensory units mounted on the human body, each containing a tri-axial accelerometer, gyroscope, and magnetometer, during 19 different human activities are used to calculate inter-subject and inter-activity variations with different methods, and the results are summarized in various forms. Absolute, Euclidean, and dynamic time-warping distances are used to assess the similarity of the signals. The comparisons are made using raw and normalized time-domain data as well as raw and normalized feature vectors. Firstly, inter-subject distances are averaged out per activity and per subject. Based on these values, the "best" subject is defined and identified according to his/her average distance to the others. Then, the averages and standard deviations of inter-activity distances are presented per subject, per unit, and per sensor. Moreover, the effects of removing the mean and of the different distance measures on the results are discussed. © 2012 IEEE.

Item Open Access: Novel noniterative orientation estimation for wearable motion sensor units acquiring accelerometer, gyroscope, and magnetometer measurements (IEEE, 2020) Yurtman, Aras; Barshan, Billur

We propose a novel noniterative method based on the physical and geometrical properties of the acceleration, angular rate, and magnetic field vectors to estimate the orientation of motion sensor units.
The proposed algorithm aims to make the vertical (up) axis of the Earth coordinate frame as close as possible to the measured acceleration vector, and to make the angle between the Earth frame's north axis and the detected magnetic field vector as close as possible to the estimated magnetic dip angle. We obtain the sensor unit orientation from the rotational quaternion transformation between the Earth and sensor unit frames. We evaluate the proposed method by incorporating it into an activity recognition scheme for daily and sports activities, which requires accurately estimated sensor unit orientations to achieve invariance to the orientations at which the units are worn on the body. Using four different classifiers on a publicly available dataset, the proposed methodology achieves an average activity recognition accuracy higher than the state-of-the-art methods, while being computationally efficient enough to be executed in real time.

Item Open Access: Position invariance for wearables: interchangeability and single-unit usage via machine learning (IEEE, 2021) Yurtman, Aras; Barshan, Billur; Redif, S.

We propose a new methodology to attain invariance to the positioning of body-worn motion-sensor units for recognizing everyday and sports activities. We first consider random interchangeability of the sensor units so that the user does not need to distinguish between them before wearing. To this end, we propose to use the compact singular value decomposition (SVD), which significantly reduces the accuracy degradation caused by random interchanging of the units. Secondly, we employ three variants of a generalized classifier that requires wearing only a single sensor unit on any one of the body parts to classify the activities. We combine both approaches with our previously developed methods to achieve invariance to both position and orientation, which ultimately allows the user significant flexibility in sensor-unit placement (position and orientation).
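The role of the SVD in tolerating interchanged units can be illustrated with a simple property: the singular values of the stacked multi-unit data matrix do not change when its rows (the per-unit signals) are permuted. This is only a sketch of that permutation-invariance idea under assumed data shapes; the paper's compact-SVD pipeline retains more of the decomposition than the singular values alone.

```python
import numpy as np

def unit_order_invariant_features(unit_signals):
    """Singular values of the stacked multi-unit data matrix.

    Each element of `unit_signals` is one sensor unit's signal segment
    (a 1-D array); stacking them as rows and taking the singular values
    yields a representation that is unchanged if the units are randomly
    interchanged, since row permutation preserves singular values.
    """
    X = np.vstack(unit_signals)
    return np.linalg.svd(X, compute_uv=False)
```

Swapping the rows before calling the function produces the identical feature vector, which is exactly the robustness needed when the user wears the units in an arbitrary order.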
We assess the performance of our proposed approach on a publicly available activity dataset recorded by body-worn motion-sensor units. Experimental results suggest that there is a tolerable reduction in accuracy, which is justified by the significant flexibility and convenience offered to users when placing the units.

Item Open Access: Recognition and classification of human activities using wearable sensors (Bilkent University, 2012) Yurtman, Aras

We address the problem of detecting and classifying human activities using two different types of wearable sensors. In the first part of the thesis, a comparative study of different techniques for classifying human activities using tag-based radio-frequency (RF) localization is provided. Position data of multiple RF tags worn on the human body are acquired asynchronously and non-uniformly. Curves fitted to the data are re-sampled uniformly and then segmented. The effect of varying the relevant system parameters on the system accuracy is investigated. Various curve-fitting, segmentation, and classification techniques are compared, and the combination resulting in the best performance is presented. The classifiers are validated through the use of two different cross-validation methods. For the complete classification problem with 11 classes, the proposed system demonstrates an average classification error of 8.67% and 21.30% for 5-fold and subject-based leave-one-out (L1O) cross validation, respectively. When the number of classes is reduced to five by omitting the transition classes, these errors become 1.12% and 6.52%. The system demonstrates acceptable classification performance even though tag-based RF localization does not provide very accurate position measurements.
In the second part, data acquired from five sensory units worn on the human body, each containing a tri-axial accelerometer, a gyroscope, and a magnetometer, during 19 different human activities are used to calculate inter-subject and inter-activity variations in the data with different methods. Absolute, Euclidean, and dynamic time-warping (DTW) distances are used to assess the similarity of the signals. The comparisons are made using time-domain data and feature vectors. Different normalization methods are used and compared. The "best" subject is defined and identified according to his/her average distance to the other subjects. Based on one of the similarity criteria proposed here, an autonomous system that detects and evaluates physical therapy exercises using inertial sensors and magnetometers is developed. An algorithm that detects all the occurrences of one or more template signals (exercise movements) in a long signal (physical therapy session) while allowing some distortion is proposed based on DTW. The algorithm classifies each execution as one of the exercises and evaluates it as correct/incorrect, identifying the error type if there is any. To evaluate the performance of the algorithm in physical therapy, a dataset is recorded consisting of one template execution and ten test executions of each of the three execution types of eight exercise movements performed by five subjects, with a total of 120 and 1,200 exercise executions in the training and test sets, respectively, as well as many idle time intervals in the test signals. The proposed algorithm detects 1,125 executions in the whole test set. 8.58% of the executions are missed and 4.91% of the idle intervals are incorrectly detected as executions. The accuracy is 93.46% for exercise classification and 88.65% for combined exercise and execution type classification.
The proposed system may be used to both estimate the intensity of the physical therapy session and evaluate the executions to provide feedback to the patient and the specialist.