Human Action Recognition Using Deep Learning Methods on Limited Sensory Data

Nilay Tufek, Murat Yalcin, Mucahit Altintas, Fatma Kalaoglu, Yi Li, Senem Kursun Bahadir*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

75 Citations (Scopus)

Abstract

In recent years, due to the widespread usage of various sensors, action recognition has become increasingly popular in many fields such as person surveillance and human-robot interaction. In this study, we aimed to develop an action recognition system using only limited accelerometer and gyroscope data. Several deep learning methods, such as Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks, along with classical machine learning algorithms and their combinations, were implemented, and a performance analysis was carried out. Data balancing and data augmentation methods were applied, and accuracy rates increased noticeably. We achieved a new state-of-the-art result on the UCI HAR dataset, with a 97.4% accuracy rate, using a 3-layer LSTM model. We also applied the same model to a collected dataset (ETEXWELD) and obtained a 99.0% accuracy rate, which constitutes a solid contribution. Moreover, the performance analysis is based not only on accuracy but also on precision, recall, and F1-score metrics. Additionally, a real-time application using the 3-layer LSTM network was developed to evaluate how robustly the best model classifies activities.
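The abstract mentions classifying windows of accelerometer and gyroscope data and boosting accuracy with data augmentation. As a hedged illustration (not the authors' exact pipeline), the sketch below shows the two preprocessing steps such systems commonly use: segmenting a continuous multichannel signal into fixed-length sliding windows (the UCI HAR dataset itself uses 128-sample windows with 50% overlap), and augmenting those windows with Gaussian jitter. The function names, window size, and noise level here are illustrative assumptions.

```python
import numpy as np

def sliding_windows(signal, window=128, overlap=0.5):
    """Segment a (T, C) multichannel signal into fixed-length windows.

    UCI HAR uses 128-sample windows (2.56 s at 50 Hz) with 50% overlap;
    other choices are possible.
    """
    step = int(window * (1 - overlap))
    n = (len(signal) - window) // step + 1
    return np.stack([signal[i * step : i * step + window] for i in range(n)])

def jitter(windows, sigma=0.05, seed=None):
    """One common augmentation: add small Gaussian noise to each window."""
    rng = np.random.default_rng(seed)
    return windows + rng.normal(0.0, sigma, size=windows.shape)

# Example: 10 s of 6-channel accelerometer + gyroscope data at 50 Hz.
raw = np.zeros((500, 6))
wins = sliding_windows(raw)        # shape: (n_windows, 128, 6)
augmented = jitter(wins, seed=0)   # same shape, noisy copies
```

The resulting `(n_windows, 128, 6)` tensors are the typical input shape for a stacked LSTM classifier: each window is a 128-step sequence of 6 sensor channels.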

Original language: English
Article number: 8918509
Pages (from-to): 3101-3112
Number of pages: 12
Journal: IEEE Sensors Journal
Volume: 20
Issue number: 6
DOIs
Publication status: Published - 15 Mar 2020

Bibliographical note

Publisher Copyright:
© 2001-2012 IEEE.

Funding

Manuscript received October 12, 2019; accepted November 13, 2019. Date of publication December 2, 2019; date of current version February 14, 2020. This work has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 644268, Welding of E-Textiles for Interactive Clothing (ETEXWELD) Project. The associate editor coordinating the review of this article and approving it for publication was Prof. Chang-Hee Won. (Corresponding author: Senem Kursun Bahadir.) N. Tufek, M. Yalcin, and M. Altintas are with the Department of Computer Engineering, Istanbul Technical University (ITU), 34467 Istanbul, Turkey (e-mail: ntufek@gmail.com; yalcinmur@itu.edu.tr; maltintas@itu.edu.tr).

Funders:

• Marie Skłodowska-Curie
• Horizon 2020 Framework Programme (grant No. 644268)

Keywords

• Activity recognition
• CNN
• LSTM
• Data augmentation
• Data balancing
• Deep learning
