Directional PointNet: 3D Environmental Classification for Wearable Robots

Keywords

PointNet
3D environmental classification
point cloud
wearable robots

How to Cite

ZHANG, K., WANG, J., & FU, C. (2019). Directional PointNet: 3D Environmental Classification for Wearable Robots. Instrumentation, 6(1). Retrieved from https://instrumentationjournal.com/index.php/instr/article/view/71

Abstract

A subject who wears a suitable robotic device will be able to walk in complex environments with the aid of environmental recognition schemes that provide reliable prior information about the human motion intent. Researchers have used 1D laser signals and 2D depth images to classify environments, but those approaches can suffer from self-occlusion. In comparison, a 3D point cloud is more appropriate for depicting the environment. This paper proposes a directional PointNet to classify 3D point clouds directly. First, an inertial measurement unit (IMU) is used to offset the orientation of the point cloud. Then the directional PointNet can accurately classify commonly traversed terrains, including level ground, climbing up stairs, and walking down stairs. A classification accuracy of 98% has been achieved in tests. Moreover, the directional PointNet is more efficient than the previously used PointNet because the T-net, which is utilized to estimate the transformation of the point cloud, is not needed in the present approach, and the length of the global feature is optimized. The experimental results demonstrate that the directional PointNet can classify environments in a robust and efficient manner.
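The orientation-offset step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the IMU reports roll/pitch/yaw Euler angles in radians and that the point cloud is an (N, 3) array; the helper names and rotation convention are hypothetical.

```python
import numpy as np

def rotation_from_imu(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from IMU Euler angles (radians).

    Hypothetical helper: the paper only states that an IMU is used to
    offset the point-cloud orientation, not the exact convention.
    Here we assume intrinsic Z-Y-X (yaw-pitch-roll) composition.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def align_point_cloud(points, roll, pitch, yaw):
    """Rotate an (N, 3) point cloud into a fixed, gravity-aligned frame,
    so the classifier sees terrain geometry independent of sensor pose."""
    R = rotation_from_imu(roll, pitch, yaw)
    return points @ R.T
```

With the cloud expressed in a consistent gravity-aligned frame, the network no longer needs a learned alignment module (the T-net), which is one source of the efficiency gain reported above.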


This work is licensed under a Creative Commons Attribution 4.0 International License.
