Neural Controlled Prosthetics

ECE Associate Professors Deniz Erdogmus, Gunar Schirner, and Taskin Padir were awarded a $603K NSF grant to develop Hand Augmentation using Nested Decision (HAND) systems for people with lost limb function. Under this collaborative research grant, "Nested Control of Assistive Robots through Human Intent Inference," Northeastern University will work with Spaulding Rehabilitation Hospital and Worcester Polytechnic Institute (WPI).


Abstract Source: NSF

Part 1: Upper-limb motor impairments arise from a wide range of clinical conditions, including amputations, spinal cord injury, and stroke. Addressing lost hand function is therefore a major focus of rehabilitation interventions, and research on robotic hands and hand exoskeletons aimed at restoring fine motor control has gained significant momentum in recent years. Integrating these robots with neural control mechanisms is also an active research direction. We will develop prosthetic and wearable hands driven by nested control that seamlessly blends neural control based on human brain activity with dynamic control based on the robots' onboard sensors. These Hand Augmentation using Nested Decision (HAND) systems will also provide rudimentary tactile feedback to the user. The HAND design framework will contribute to the assistive and augmentative robotics field, and the resulting technology will improve the quality of life of individuals with lost limb function. The project will help train engineers skilled in addressing multidisciplinary challenges. Through outreach activities, STEM careers will be promoted at the K-12 level, and individuals from groups underrepresented in engineering will be recruited to engage in this research, contributing to the diversity of the STEM workforce.
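As a rough illustration of the blended control idea described above, the sketch below pairs a slow outer loop that decodes a grasp intent from brain-derived features with a fast inner loop that regulates fingertip force from the robot's own sensors. It is a minimal Python sketch under assumed names and parameters (decode_intent, GRIP_FORCE_TARGETS, the gain kp), not the project's actual controller.

```python
# Minimal sketch (not the HAND project's implementation) of a nested control loop
# that blends a high-level grasp intent decoded from neural activity with a
# low-level controller driven by the robot's force sensors. All names, targets,
# and gains are illustrative assumptions.

import numpy as np

GRIP_FORCE_TARGETS = {"rest": 0.0, "pinch": 2.0, "power_grasp": 8.0}  # newtons, assumed

def decode_intent(eeg_features: np.ndarray) -> str:
    """Outer (neural) loop placeholder: map EEG features to a grasp intent."""
    # A real decoder would be a trained classifier; here we threshold one statistic.
    return "power_grasp" if eeg_features.mean() > 0.5 else "rest"

def inner_loop_step(target_force: float, measured_force: float, kp: float = 0.6) -> float:
    """Inner (dynamic) loop: proportional control on fingertip force from robot sensors."""
    return kp * (target_force - measured_force)

def nested_control_step(eeg_features: np.ndarray, measured_force: float):
    """One outer-loop decision followed by one inner-loop correction."""
    intent = decode_intent(eeg_features)                # slow, uncertain human-intent channel
    target = GRIP_FORCE_TARGETS[intent]                 # map intent to a force setpoint
    command = inner_loop_step(target, measured_force)   # fast, sensor-driven correction
    return intent, command

if __name__ == "__main__":
    intent, cmd = nested_control_step(np.array([0.7, 0.9]), measured_force=3.0)
    print(intent, round(cmd, 2))  # -> power_grasp 3.0
```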

Part 2: The team previously introduced the concept of human-in-the-loop cyber-physical systems (HILCPS). Using the HILCPS hardware-software co-design and automatic synthesis infrastructure, we will develop prosthetic and wearable HAND systems that are robust to uncertainty in inferring human intent from physiological signals. One challenge arises from the fact that the human and the cyber system jointly operate on the same physical element; synthesizing networked real-time applications from algorithm design environments poses a further framework challenge. These will be addressed by a tightly coupled, optimal nested control strategy that relies on EEG-EMG-context fusion for human intent inference. Custom distributed embedded computational and robotic platforms will be built and iteratively refined. This work will enhance the HILCPS design framework while simultaneously making novel contributions to body/brain interface technology and assistive/augmentative robot technology. Specifically, we will (1) develop a theoretical EEG-EMG-context fusion framework for agile HILCPS application domains; (2) develop theory for, and design, novel control-theoretic solutions that handle uncertainty and blend motion/force planning with high-level human intent and ambient intelligence to robustly execute daily manipulation activities; (3) further develop and refine the HILCPS domain-specific design framework to enable rapid deployment of HILCPS algorithms onto distributed embedded systems, empowering a new class of real-time algorithms for distributed embedded sensing, analysis, and decision making; and (4) develop new paradigms to replace, retrain, or augment hand function via the prosthetic/wearable HAND by optimizing performance on a subject-by-subject basis.
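To make the EEG-EMG-context fusion idea concrete, the sketch below combines per-intent likelihoods from an EEG decoder and an EMG decoder with a context-derived prior under a naive independence assumption. The intent vocabulary, the fuse_intent function, and the example numbers are illustrative assumptions, not the grant's actual fusion framework.

```python
# Illustrative sketch of EEG-EMG-context fusion for human intent inference:
# each channel contributes a likelihood over a small set of candidate intents,
# and the posterior is formed under a naive conditional-independence assumption.
# All names and values are assumptions for the example.

import numpy as np

INTENTS = ["rest", "reach", "grasp", "release"]   # assumed intent vocabulary

def fuse_intent(eeg_like: np.ndarray,
                emg_like: np.ndarray,
                ctx_prior: np.ndarray) -> tuple[str, np.ndarray]:
    """Combine per-intent likelihoods from EEG and EMG with a context prior.

    Each input is a vector aligned with INTENTS. Work in log space for
    numerical stability, then normalize to a posterior over intents.
    """
    log_post = np.log(ctx_prior) + np.log(eeg_like) + np.log(emg_like)
    log_post -= log_post.max()                    # guard against underflow
    post = np.exp(log_post)
    post /= post.sum()
    return INTENTS[int(post.argmax())], post

if __name__ == "__main__":
    eeg = np.array([0.1, 0.2, 0.6, 0.1])          # EEG decoder favors "grasp"
    emg = np.array([0.2, 0.3, 0.4, 0.1])          # EMG weakly agrees
    ctx = np.array([0.1, 0.1, 0.7, 0.1])          # context: object near hand aperture
    intent, posterior = fuse_intent(eeg, emg, ctx)
    print(intent, np.round(posterior, 3))
```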

Related Faculty: Deniz Erdogmus, Gunar Schirner

Related Departments: Electrical & Computer Engineering