Abstract
The ability of automated technologies to correctly identify a human's actions provides considerable scope for systems that rely on human-machine interaction. Automatic 3D Human Action Recognition is therefore an area that has seen significant research effort. In the work described here, a human's everyday 3D actions recorded in the NTU RGB+D dataset are identified using a novel structured-tree neural network. The nodes of the tree represent the skeleton joints, with the spine joint represented by the root. The connection from a child node to its parent is known as the incoming edge, while the reciprocal connection is known as the outgoing edge. The use of a tree structure leads to a system that intuitively maps to human movements. The classifier uses the change in displacement of joints and the change in the angles between incoming and outgoing edges as features for classification of the actions performed.
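As a minimal sketch of the skeleton-tree representation and features described above, the snippet below builds a hypothetical parent map rooted at the spine joint and computes, per joint, the displacement change between frames and the change in angle between the incoming and outgoing edges. The joint names, tree layout, and helper functions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical parent map: child joint -> parent joint, with "spine" as root.
PARENTS = {
    "neck": "spine", "head": "neck",
    "left_shoulder": "neck", "left_elbow": "left_shoulder", "left_hand": "left_elbow",
    "right_shoulder": "neck", "right_elbow": "right_shoulder", "right_hand": "right_elbow",
    "left_hip": "spine", "left_knee": "left_hip", "left_foot": "left_knee",
    "right_hip": "spine", "right_knee": "right_hip", "right_foot": "right_knee",
}

def edge_angle(joints, child):
    """Angle between the incoming edge (parent -> child) and each outgoing
    edge (child -> grandchild), averaged when a joint has several children."""
    incoming = joints[child] - joints[PARENTS[child]]
    angles = []
    for grandchild, parent in PARENTS.items():
        if parent != child:
            continue
        outgoing = joints[grandchild] - joints[child]
        cos = np.dot(incoming, outgoing) / (
            np.linalg.norm(incoming) * np.linalg.norm(outgoing) + 1e-8
        )
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.mean(angles) if angles else 0.0

def frame_features(prev_joints, curr_joints):
    """Per-joint change in displacement and change in incoming/outgoing
    edge angle between two consecutive frames (joints: name -> 3D position)."""
    feats = []
    for child in PARENTS:
        displacement = np.linalg.norm(curr_joints[child] - prev_joints[child])
        angle_change = edge_angle(curr_joints, child) - edge_angle(prev_joints, child)
        feats.extend([displacement, angle_change])
    return np.array(feats)
```

A feature vector of this form, computed for each pair of consecutive frames, could then be fed to a classifier over the action labels; the actual network architecture is described in the paper itself.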
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 849-854 |
| Journal | European Journal of Engineering Research & Science |
| Volume | 5 |
| Issue number | 8 |
| DOIs | |
| Publication status | Published - 13 Aug 2020 |
Keywords
- Structure-Tree Neural Network (STNN)
- Skeleton
- Human Action Recognition (HAR)