Journal: Computing

Multi-input CNN-GRU based human activity recognition using wearable sensors


Abstract

Human Activity Recognition (HAR) has attracted much attention from researchers in the recent past. Research into HAR is driven by the motive to understand human behaviour and, inherently, to anticipate human intentions. Human activity data obtained via wearable sensors such as gyroscopes and accelerometers takes the form of time series data, as each reading has a timestamp associated with it. For HAR, it is important to extract the relevant temporal features from raw sensor data. Most approaches to HAR involve a considerable amount of feature engineering and data pre-processing, which in turn requires domain expertise; such approaches are time-consuming and application-specific. In this work, a Deep Neural Network based model, which combines a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU), is proposed as an end-to-end model that performs both automatic feature extraction and classification of the activities. The experiments in this work were carried out using raw data obtained from wearable sensors with minimal pre-processing and do not involve any handcrafted feature extraction techniques. The accuracies obtained on the UCI-HAR, WISDM, and PAMAP2 datasets are 96.20%, 97.21%, and 95.27% respectively. The results of the experiments establish that the proposed model achieves superior classification performance compared with other similar architectures.
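The abstract notes that timestamped sensor streams must be turned into inputs suitable for an end-to-end network. A common first step for datasets such as UCI-HAR is to segment the raw signal into fixed-length, 50%-overlapping windows (UCI-HAR uses 128 readings per window at 50 Hz). The sketch below illustrates that segmentation step only; the function name and parameters are assumptions for illustration, not details taken from the paper.

```python
def sliding_windows(samples, window_size=128, overlap=0.5):
    """Split a sequence of sensor readings into overlapping windows.

    samples: list of readings, e.g. (ax, ay, az) accelerometer tuples
    window_size: number of readings per window
    overlap: fraction of each window shared with the next one
    """
    step = int(window_size * (1 - overlap))
    windows = []
    for start in range(0, len(samples) - window_size + 1, step):
        windows.append(samples[start:start + window_size])
    return windows


# Example: 6 seconds of 50 Hz accelerometer data (300 readings)
readings = [(0.0, 0.0, 9.8)] * 300
wins = sliding_windows(readings, window_size=128, overlap=0.5)
# Each window can then be fed to a CNN branch as a
# (window_size, channels) array, with one branch per sensor
# in a multi-input architecture.
```

Each wearable sensor (accelerometer, gyroscope) would contribute its own windowed stream, which is what makes the model "multi-input".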
