
Clinical Trial Summary

The purpose of this study is to evaluate a new control method (i.e., the vision-guided shared control) for a wheelchair-mounted assistive robotic manipulator among powered wheelchair users. The study will include a questionnaire about general demographics, health information, and previous experience with assistive technology. Several tests will also be administered to assess upper extremity function and ability as well as spatial orientation and visualization ability. Participants will then undergo a training phase with the assistive robotic manipulator mounted on a table to determine whether they are eligible for participation in the study. Eligible participants will move on to a second training phase in which they will be asked to learn and practice slightly more complex tasks while using the vision-guided shared controller. After this training, the assistive robotic manipulator will be mounted to the participant's wheelchair, and participants will be asked to complete a number of everyday tasks from a task list. At the conclusion of the study, researchers will conduct a brief semi-structured interview with each participant to gain more insight into how participants perceive the ease of use and usefulness of the vision-guided shared control.


Clinical Trial Description

Veterans who use powered mobility devices, including those with high-level spinal cord injury (SCI), amyotrophic lateral sclerosis (ALS), and multiple sclerosis (MS), often experience serious upper extremity impairments. Management and care of upper extremity impairments often involve a range of assistive solutions. However, product availability and technological advancement for manipulation assistance fall far behind those for mobility. Many of these individuals, despite their independent mobility, cannot reach for a glass of water, make a simple meal, or pick up a toothbrush. They still require assistance from a personal caregiver for essential activities of daily living (ADLs) involving reaching and object handling/manipulation. With the rapid advancement of robotics technology, assistive robotic manipulators (ARMs) have emerged as a viable solution for assisting Veterans with upper extremity impairments in completing daily tasks involving reaching, object handling, and manipulation. ARMs are often equipped with many degrees of freedom (DOF), but users cannot control all of the DOFs at the same time with a conventional joystick and must switch modes frequently to complete even simple manipulation tasks, especially when the ARM gets close to the target and needs to be aligned appropriately for manipulation. Thus, existing ARMs suffer from a lack of efficiency and effectiveness, especially in unstructured environments.

The goal of this project is to combine vision-guided shared (VGS) control with two types of environment modifications to address the effectiveness and efficiency of ARMs for real-world use. The two types of environment modifications are using commercial or custom adaptive tools (e.g., a holder that can hold a bottle or jar so an ARM can open it) and adding fiducial markers (similar to QR codes) to objects or adaptive tools to make vision-based tracking robust and reliable for real-world applications. Built upon these environment modifications, the VGS control will allow a user to initiate any task by moving the ARM close to a tagged object, with the ARM taking over fine manipulation upon detecting the target.

This project will evaluate the new control among 16 powered wheelchair users, who will use a wheelchair-mounted ARM to complete a set of everyday manipulation tasks. Participants will complete a set of 10 manipulation tasks using both the default control method and the new VGS control method. Researchers will collect outcome measures of efficiency (i.e., task completion time and mode switching frequency), effectiveness (i.e., task completion success rate), and usability (i.e., NASA Task Load Index and System Usability Scale). Investigators expect to improve the manipulation functions of Veterans with upper limb impairments through a more practical and usable implementation of vision-based robotic control and human-robot interaction technologies.
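The VGS control described above splits the work between the user and the robot: the user handles gross positioning, and the ARM takes over fine manipulation once a tagged target is detected close enough. The minimal sketch below illustrates only that handoff logic; the mode names, distance threshold, and planning stub are hypothetical assumptions for illustration and not the study's actual controller.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Mode(Enum):
    USER_TELEOP = auto()   # user drives the ARM with the default input device
    AUTO_FINE = auto()     # ARM autonomously performs fine alignment and grasping


@dataclass
class MarkerObservation:
    marker_id: int       # ID of the fiducial marker on the object or adaptive tool
    distance_m: float    # estimated gripper-to-target distance in meters


HANDOFF_DISTANCE_M = 0.15  # assumed threshold at which the ARM takes over


def select_mode(observation: Optional[MarkerObservation]) -> Mode:
    """Hand control to the robot once a tagged target is detected within range."""
    if observation is not None and observation.distance_m <= HANDOFF_DISTANCE_M:
        return Mode.AUTO_FINE
    return Mode.USER_TELEOP


def plan_fine_manipulation(observation: MarkerObservation) -> str:
    """Placeholder for the autonomous alignment/grasp routine (an assumption here)."""
    return f"align and grasp object tagged {observation.marker_id}"


def control_step(user_command: str, observation: Optional[MarkerObservation]):
    """One control cycle: pass the user's command through during gross positioning,
    or substitute an autonomous fine-manipulation command after the handoff."""
    mode = select_mode(observation)
    if mode is Mode.USER_TELEOP:
        return mode, user_command
    return mode, plan_fine_manipulation(observation)


if __name__ == "__main__":
    # The user drives toward a tagged object; at 12 cm the ARM takes over.
    print(control_step("joystick: forward", None))
    print(control_step("joystick: forward", MarkerObservation(marker_id=7, distance_m=0.12)))

In practice the detection would come from a camera tracking the fiducial markers on the objects or adaptive tools, but the same handoff rule applies: the user stays in control until the tagged target is reliably detected within reach.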


NCT number NCT04323449
Study type Interventional
Source VA Office of Research and Development
Contact Dan Ding, PhD
Phone (412) 688-6000
Email dad5@pitt.edu
Status Recruiting
Phase N/A
Start date April 13, 2022
Completion date June 30, 2024