Activity Recognition Invariant to Wearable Sensor Unit Orientation Using Differential Rotational Transformations Represented by Quaternions
Saved in:

Journal title: | Sensors |
---|---|
Authors: | Yurtman, Aras; Barshan, Billur; Fidan, Barış |
In: | Sensors, 18, 2018, 8, p. 2725 |
Format: | E-Article |
Language: | English |
Published: | MDPI AG |
Keywords: | Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry |
author |
Yurtman, Aras; Barshan, Billur; Fidan, Barış |
doi_str_mv |
10.3390/s18082725 |
facet_avail |
Online Free |
finc_class_facet |
Technology; Mathematics; Physics; Chemistry and Pharmacy; General |
format |
ElectronicArticle |
id |
ai-49-aHR0cDovL2R4LmRvaS5vcmcvMTAuMzM5MC9zMTgwODI3MjU |
institution |
DE-Brt1 DE-Zwi2 DE-D161 DE-Gla1 DE-Zi4 DE-15 DE-Pl11 DE-Rs1 DE-105 DE-14 DE-Ch1 DE-L229 DE-D275 DE-Bn3 |
imprint |
MDPI AG, 2018 |
issn |
1424-8220 |
language |
English |
mega_collection |
MDPI AG (CrossRef) |
publishDateSort |
2018 |
publisher |
MDPI AG |
recordtype |
ai |
series |
Sensors |
source_id |
49 |
title |
Activity Recognition Invariant to Wearable Sensor Unit Orientation Using Differential Rotational Transformations Represented by Quaternions |
topic |
Electrical and Electronic Engineering Biochemistry Instrumentation Atomic and Molecular Physics, and Optics Analytical Chemistry |
url |
http://dx.doi.org/10.3390/s18082725 |
publishDate |
2018 |
physical |
2725 |
description |
<jats:p>Wearable motion sensors are assumed to be correctly positioned and oriented in most of the existing studies. However, generic wireless sensor units, patient health and state monitoring sensors, and smart phones and watches that contain sensors can be differently oriented on the body. The vast majority of the existing algorithms are not robust against placing the sensor units at variable orientations. We propose a method that transforms the recorded motion sensor sequences invariantly to sensor unit orientation. The method is based on estimating the sensor unit orientation and representing the sensor data with respect to the Earth frame. We also calculate the sensor rotations between consecutive time samples and represent them by quaternions in the Earth frame. We incorporate our method in the pre-processing stage of the standard activity recognition scheme and provide a comparative evaluation with the existing methods based on seven state-of-the-art classifiers and a publicly available dataset. The standard system with fixed sensor unit orientations cannot handle incorrectly oriented sensors, resulting in an average accuracy reduction of 31.8%. Our method results in an accuracy drop of only 4.7% on average compared to the standard system, outperforming the existing approaches that cause an accuracy degradation between 8.4 and 18.8%. We also consider stationary and non-stationary activities separately and evaluate the performance of each method for these two groups of activities. All of the methods perform significantly better in distinguishing non-stationary activities, our method resulting in an accuracy drop of 2.1% in this case. Our method clearly surpasses the remaining methods in classifying stationary activities where some of the methods noticeably fail. 
The proposed method is applicable to a wide range of wearable systems to make them robust against variable sensor unit orientations by transforming the sensor data at the pre-processing stage.</jats:p> |
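The abstract above describes computing the sensor's rotation between consecutive time samples and representing it as a quaternion in the Earth frame. The paper's exact pipeline is not reproduced in this record; the following is only a minimal sketch of that differential-rotation step, assuming unit orientation quaternions `q[t]` (sensor frame to Earth frame, `(w, x, y, z)` convention) have already been estimated per sample. All function names here are illustrative, not the authors' code.

```python
def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def qconj(q):
    """Conjugate of a quaternion; the inverse for unit quaternions."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def differential_rotations(orientations):
    """For a sequence of unit orientation quaternions, return the quaternion
    that rotates the orientation at sample t into the orientation at t+1:
    d[t] = q[t+1] * conj(q[t])."""
    return [qmul(orientations[t + 1], qconj(orientations[t]))
            for t in range(len(orientations) - 1)]
```

Because each differential quaternion depends only on the change between consecutive samples, the resulting sequence is unaffected by how the unit was initially oriented on the body, which is the invariance property the abstract claims.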
container_issue |
8 |
container_start_page |
0 |
container_title |
Sensors |
container_volume |
18 |
format_de105 |
Article, E-Article |
format_de14 |
Article, E-Article |
format_de15 |
Article, E-Article |
format_de520 |
Article, E-Article |
format_de540 |
Article, E-Article |
format_dech1 |
Article, E-Article |
format_ded117 |
Article, E-Article |
format_degla1 |
E-Article |
format_del152 |
Buch |
format_del189 |
Article, E-Article |
format_dezi4 |
Article |
format_dezwi2 |
Article, E-Article |
format_finc |
Article, E-Article |
format_nrw |
Article, E-Article |
geogr_code |
not assigned |
geogr_code_person |
not assigned |