Wearables-based multi-task gait and activity segmentation using recurrent neural networks

Martindale C, Christlein V, Klumpp P, Eskofier B (2021)


Publication Type: Journal article

Publication year: 2021

Journal: Neurocomputing

Volume: 432

Pages Range: 250-261

DOI: 10.1016/j.neucom.2020.08.079

Abstract

Human activity recognition (HAR) and cycle analysis, such as gait analysis, have become an integral part of daily life, from gesture recognition to step counting. As the available data and the possible application areas grow, an efficient solution without the need for handcrafted feature extraction is needed. We propose a multi-task recurrent neural network architecture that uses inertial sensor data to both segment and recognise activities and cycles. The solution is validated using three publicly available datasets consisting of more than 120 subjects and 8 activities, 6 of which are cyclic. Our architecture is smaller than comparable HAR models while being robust to different sensor placements and channels. Our proposed solution outperforms or defines the state of the art for HAR and cycle analysis using inertial sensors. We achieve an overall activity F1-score of 92.6 % and a phase detection F1-score of 98.2 %. The gait analysis achieves a mean stride time error of 5.3 ± 51.9 ms and a swing duration error of 0.0 ± 5.9 %. The overall step count error for all activities is -1.5 ± 2.8 %. Thus, we provide a method that is not dependent on feature extraction and a model that is sensor and location independent.
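The abstract's core idea, a shared recurrent trunk over inertial sensor channels feeding separate per-timestep heads for activity recognition and cycle segmentation, can be sketched as follows. This is a minimal illustration only: the simple Elman cell, layer sizes, and the three-class phase head are assumptions for demonstration, not the authors' actual architecture or trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 6      # e.g. 3-axis accelerometer + 3-axis gyroscope (assumed)
HIDDEN = 32         # assumed hidden size
N_ACTIVITIES = 8    # matches the 8 activities in the datasets
N_PHASES = 3        # assumed cycle-phase classes for illustration

# Shared recurrent trunk (plain Elman RNN) parameters.
W_in = rng.normal(0, 0.1, (HIDDEN, N_CHANNELS))
W_h = rng.normal(0, 0.1, (HIDDEN, HIDDEN))
b_h = np.zeros(HIDDEN)

# Task-specific output heads sharing the same hidden state.
W_act = rng.normal(0, 0.1, (N_ACTIVITIES, HIDDEN))
W_phase = rng.normal(0, 0.1, (N_PHASES, HIDDEN))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """x: (T, N_CHANNELS) sensor window -> per-timestep predictions."""
    T = x.shape[0]
    h = np.zeros(HIDDEN)
    activities = np.empty((T, N_ACTIVITIES))
    phases = np.empty((T, N_PHASES))
    for t in range(T):
        h = np.tanh(W_in @ x[t] + W_h @ h + b_h)  # shared hidden state
        activities[t] = softmax(W_act @ h)        # activity-recognition head
        phases[t] = softmax(W_phase @ h)          # cycle-phase head
    return activities, phases

# A short synthetic IMU window of 100 samples.
window = rng.normal(size=(100, N_CHANNELS))
act_probs, phase_probs = forward(window)
print(act_probs.shape, phase_probs.shape)  # (100, 8) (100, 3)
```

Because both heads read the same hidden state, labelling and segmentation are trained jointly, which is the multi-task property the paper exploits; the feature extraction is learned by the recurrent trunk rather than handcrafted.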


How to cite

APA:

Martindale, C., Christlein, V., Klumpp, P., & Eskofier, B. (2021). Wearables-based multi-task gait and activity segmentation using recurrent neural networks. Neurocomputing, 432, 250-261. https://dx.doi.org/10.1016/j.neucom.2020.08.079

MLA:

Martindale, Christine, et al. "Wearables-based multi-task gait and activity segmentation using recurrent neural networks." Neurocomputing 432 (2021): 250-261.
