PTLHAR: PoseNet and transfer learning for human activities recognition based on body articulations
Paper, published 31 January 2020
Nozha Jlidi, Ahmed Snoun, Tahani Bouchrika, Olfa Jemai, Mourad Zaied
Proceedings Volume 11433, Twelfth International Conference on Machine Vision (ICMV 2019); 114330Q (2020) https://doi.org/10.1117/12.2559567
Event: Twelfth International Conference on Machine Vision, 2019, Amsterdam, Netherlands
Abstract
This paper introduces a novel approach for human activities recognition (HAR) based on body articulations (joints), the connections between bones in the human skeletal system, such as the knee, shoulder, and hand, which allow different degrees and types of movement. To implement our system, we use PoseNet to extract articulation points, which are then classified with a transfer learning approach to recognize the activity. In the rest of the paper, the proposed system is referred to as PTLHAR. Experimental results show that the proposed approach provides a significant improvement over state-of-the-art methods.
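The abstract describes a two-stage pipeline: PoseNet detects body articulation points, and a transfer-learned classifier maps them to an activity label. A minimal sketch of that flow is shown below; the 17-keypoint COCO layout matches PoseNet's output format, but the feature normalization and the softmax head (standing in for the paper's transfer-learned network) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# PoseNet outputs 17 COCO-style body keypoints (nose, shoulders, knees, ...),
# each as (x, y, confidence).
NUM_KEYPOINTS = 17

def keypoints_to_features(keypoints):
    """Flatten a (17, 3) array of (x, y, confidence) into a feature vector,
    normalizing coordinates to the bounding box of the detected joints so the
    features are invariant to the person's position and scale in the frame."""
    kp = np.asarray(keypoints, dtype=np.float64)
    assert kp.shape == (NUM_KEYPOINTS, 3)
    xy = kp[:, :2]
    xy = (xy - xy.min(axis=0)) / (np.ptp(xy, axis=0) + 1e-8)
    return np.concatenate([xy.ravel(), kp[:, 2]])

def classify(features, weights, bias):
    """Softmax head standing in for the transfer-learned classifier."""
    logits = features @ weights + bias
    e = np.exp(logits - logits.max())  # subtract max for numerical stability
    return e / e.sum()

# Demo with random data in place of a real PoseNet detection.
rng = np.random.default_rng(0)
activities = ["walking", "sitting", "waving"]      # hypothetical labels
pose = rng.random((NUM_KEYPOINTS, 3))              # dummy PoseNet output
feats = keypoints_to_features(pose)                # 51-dim feature vector
W = rng.standard_normal((feats.size, len(activities)))
b = np.zeros(len(activities))
probs = classify(feats, W, b)
print(activities[int(np.argmax(probs))])
```

In the paper's setting, `classify` would be replaced by a network pretrained on a large dataset and fine-tuned on activity labels; the sketch only fixes the data shapes flowing between the two stages.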
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Nozha Jlidi, Ahmed Snoun, Tahani Bouchrika, Olfa Jemai, and Mourad Zaied "PTLHAR: PoseNet and transfer learning for human activities recognition based on body articulations", Proc. SPIE 11433, Twelfth International Conference on Machine Vision (ICMV 2019), 114330Q (31 January 2020); https://doi.org/10.1117/12.2559567
CITATIONS: Cited by 3 scholarly publications.
KEYWORDS: Video, RGB color model, Feature extraction, Sensors, Bone, Data modeling, Lithium