In recent years, power wheelchairs have become essential devices for providing autonomy and independence to people with motor impairments. In particular, many power wheelchairs feature robotic arms for gesture emulation, such as interacting with objects. However, complex robotic arms are often controlled through a joystick, which makes them difficult for impaired users to operate. Paradoxically, if users were able to proficiently control such devices, they would not need them. For this reason, this paper presents a highly autonomous robotic arm, designed to minimize the effort required for its control.
To this end, the arm features an easy-to-use human-machine interface and is driven by computer vision algorithms implementing a Position-Based Visual Servoing (PBVS) control. The control was realized by extracting features from the images captured by the camera and fusing them with the distance to the target, obtained from a proximity sensor. The Parallel Tracking and Mapping (PTAM) algorithm was used to estimate the 3D position of the task object in the camera reference frame. The visual servoing algorithm runs in real time on an embedded platform. Each part of the control loop was developed in the Robot Operating System (ROS) environment, which allows the above algorithms to be implemented as separate nodes. Theoretical analysis, simulations and in-system measurements proved the effectiveness of the proposed solution.
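As background, a classical eye-in-hand PBVS control law (a standard textbook formulation, not necessarily the exact law used in this work) regulates the pose error between the current camera frame $c$ and the desired camera frame $c^*$. Choosing the error as the desired-frame translation ${}^{c^*}\mathbf{t}_c$ together with the axis-angle rotation $\theta\mathbf{u}$, the velocity commands can be sketched as:

```latex
% Classical PBVS control law (eye-in-hand), sketch only.
% {}^{c^*}\mathbf{t}_c : position of the current camera frame in the desired frame
%                        (here obtainable from PTAM fused with the proximity distance)
% {}^{c^*}\mathbf{R}_c : rotation from current to desired camera frame
% \theta\mathbf{u}     : axis-angle parametrization of that rotation
% \lambda > 0          : control gain (assumed scalar for simplicity)
\begin{aligned}
\mathbf{v}_c        &= -\lambda \, {}^{c^*}\mathbf{R}_c^{\top} \, {}^{c^*}\mathbf{t}_c, \\
\boldsymbol{\omega}_c &= -\lambda \, \theta\mathbf{u}.
\end{aligned}
```

With this choice the camera translates along a straight line toward the goal pose while the rotation decays exponentially, which is the usual motivation for position-based (rather than image-based) servoing.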
Keywords: Robotic Arm, Power Wheelchair, Visual Servoing, PBVS, Eye-in-Hand, Computer Vision, SIFT, Feature Extraction, PTAM, ROS, Human-Machine Interface, Assistive Technology, Open Source