
Exoskeleton Prototype 3 (EXO-UL3)

 

Integrating human and robot into a single system offers remarkable opportunities for creating a new generation of assistive technology for both healthy and disabled people. Humans possess naturally developed algorithms for control of movement, but they are limited by their muscle strength. In addition, muscle weakness is the primary cause of disability for most people with neuromuscular diseases and injuries to the central nervous system. In contrast, robotic manipulators can perform tasks requiring large forces; however, their artificial control algorithms do not provide the flexibility to perform in a wide range of fuzzy conditions while preserving the same quality of performance as humans. It seems therefore that combining these two entities, the human and the robot, into one integrated system under the control of the human, may lead to a solution that will benefit from the advantages offered by each subsystem.

The exoskeleton robot, serving as an assistive device, is worn by the human (orthotic) and functions as a human-amplifier. Its joints and links correspond to those of the human body, and its actuators share a portion of the external load with the operator. One of the primary innovative ideas of the proposed research is to set the Human Machine Interface (HMI) at the neuromuscular level of the human physiological hierarchy using the body's own neural command signals as one of the primary command signals of the exoskeleton. These signals will be in the form of processed surface electromyography (sEMG) signals, detected by surface electrodes placed on the operator's skin. The proposed HMI takes advantage of the electro-chemical-mechanical delay, which inherently exists in the musculoskeletal system, between the time when the neural system activates the muscular system and the time when the muscles generate moments around the joints. The myoprocessor is a model of the human muscle running in real-time and in parallel to the physiological muscle. During the electro-chemical-mechanical time delay, the system will gather information regarding the physiological muscle’s neural activation level based on processed sEMG signals, the joint position, and angular velocity, and will predict using the myoprocessor the force that will be generated by the muscle before physiological contraction occurs. By the time the human muscles contract, the exoskeleton will move with the human in a synergistic fashion, allowing natural control of the exoskeleton as an extension of the operator's body.
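The prediction pipeline described above (sEMG activation, then a muscle model, then joint torque) can be sketched as follows. This is a minimal illustrative sketch of a Hill-type myoprocessor, not the lab's actual model; the activation nonlinearity, maximal force, optimal fiber length, velocity limit, and moment arm are all hypothetical placeholder values.

```python
import math

def activation_from_emg(emg_rms, shape=-2.0):
    """Map rectified, low-pass-filtered sEMG (normalized to 0-1) to
    muscle activation via an exponential nonlinearity; the shape
    factor here is an illustrative placeholder."""
    return (math.exp(shape * emg_rms) - 1.0) / (math.exp(shape) - 1.0)

def hill_muscle_force(activation, fiber_len, fiber_vel,
                      f_max=1000.0, l_opt=0.10, v_max=1.0):
    """Simplified Hill-type model: active force scaled by a Gaussian
    force-length factor and a hyperbolic force-velocity factor.
    All parameter values are illustrative placeholders."""
    f_l = math.exp(-((fiber_len - l_opt) / (0.45 * l_opt)) ** 2)
    if fiber_vel >= 0.0:  # shortening
        f_v = max(0.0, (v_max - fiber_vel) / (v_max + 4.0 * fiber_vel))
    else:                 # lengthening: crude constant plateau
        f_v = 1.3
    return activation * f_max * f_l * f_v

def joint_torque(muscle_force, moment_arm=0.04):
    """Predicted joint torque: muscle force times its moment arm."""
    return muscle_force * moment_arm
```

Because the sEMG burst precedes mechanical contraction, a loop like this can run inside the electro-chemical-mechanical delay window and hand the predicted torque to the exoskeleton controller before the physiological muscle produces force.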

The goal of this research is to design, build, and study the integration of a powered exoskeleton controlled by myosignals for the human arm. The research will pursue this goal through several objectives: (i) developing an 8 degrees of freedom powered anthropomorphic exoskeleton for the arm, including grasping/releasing; (ii) setting the HMI at the neuromuscular level by using processed sEMG signals as the primary command signal to the exoskeleton system; (iii) developing muscle models (myoprocessor) for predicting the human arm joints' torques; (iv) developing control algorithms that will fuse information from multiple sensors and will guarantee stable exoskeleton operation; (v) evaluating the overall performance of the integrated system using standardized arm/hand function tests. These goals and objectives will be pursued using several experimental protocols aimed at developing the myoprocessors and evaluating the exoskeleton performance. The proposed experimental protocol includes only healthy subjects as the first step in a long-term goal aimed to evaluate the exoskeleton performance with disabled subjects suffering from various neurological disabilities, such as stroke, spinal cord injury, muscular dystrophies, and other neurodegenerative disorders.

 

Figure: Multi-degree-of-freedom (DOF) conceptual model of the upper limb exoskeleton (the additional DOF that allows hand grasping is not illustrated). Black represents links, red represents powered (actuated) joints, and green represents multi-axis force sensors.

It is anticipated that the proposed research will advance the current knowledge in the field of modeling human muscles and their mathematical formulation. This knowledge will be further used to create a novel HMI and will permit a better understanding of the interaction between human and robot at the neural level. In addition, the proposed research will provide a tool and fundamental understanding regarding the development of an assistive technology for improving the quality of life of the disabled community. The proposed scientific activity will promote interdisciplinary collaboration between students and faculty members from the fields of electrical engineering, mechanical engineering, bioengineering, and rehabilitation medicine.

 

 

High Resolution Photos

Exoskeleton – One Arm (Jacob Rosen)
Exoskeleton – Two Arms (Joel Perry)
Exoskeleton – Two Arms (Jacob Rosen) - Photo Credit: Jim Mackenzie



Projects


Virtual Worlds / Games With Haptics Utilizing Microsoft Robotic Studio Toolbox

Device: EXO-UL7
Methodology: Robotics Studio - Microsoft
Status: Active Research

The Human Arm Kinematics and Dynamics During Daily Activities Toward a 7 DOF Upper Limb Powered Exoskeleton

Device: Vicon System (Real-Time)
Methodology: Human Subjects - Arm Kinematics / Dynamics
Status: Completed



Publications


(*) Note: Most of the Bionics Lab publications are available online in PDF format. You may use the publication's reference number as a link to the individual manuscript.


Cavallaro E., J. Rosen, J. C. Perry, S. Burns, B. Hannaford, Hill-Based Model as a Myoprocessor for a Neural Controlled Powered Exoskeleton Arm – Parameter Optimization, Proceedings of the 2005 IEEE International Conference on Robotics and Automation (ICRA 2005), pp. 4525–4530, Barcelona, Spain, April 2005 [CP18]

Rosen J., J. C. Perry, N. Manning, S. Burns, B. Hannaford, The Human Arm Kinematics and Dynamics During Daily Activities – Toward a 7 DOF Upper Limb Powered Exoskeleton, ICAR 2005, Seattle, WA, July 2005 [CP19]

Perry J. C., J. Rosen, Design of a 7 Degree-of-Freedom Upper-Limb Powered Exoskeleton, Proceedings of the 2006 BioRob Conference, Pisa, Italy, February 2006 [CP24]

Cavallaro E., J. Rosen, J. C. Perry, S. Burns, Myoprocessor for a Neural Controlled Powered Exoskeleton Arm, IEEE Transactions on Biomedical Engineering, Vol. 53, No. 11, pp. 2387–2396, November 2006 [JP12]

Perry J. C., J. Rosen, S. Burns, Upper-Limb Powered Exoskeleton Design, IEEE/ASME Transactions on Mechatronics, Vol. 12, No. 4, pp. 408–417, August 2007 [JP13]

Rosen J., J. C. Perry, Upper Limb Powered Exoskeleton, International Journal of Humanoid Robotics, Vol. 4, No. 3, pp. 1–20, 2007 [JP15]

 



Multimedia


Exoskeleton Prototype 3 - 7 Degrees of Freedom (DOF)
The exoskeleton arm includes 7 DOF, matching the 7 DOF of the human arm. The exoskeleton's operational workspace covers 95% of the human arm workspace.



Exoskeleton Prototype 3 - Gravity Compensation Algorithm
The weight of the exoskeleton arm is compensated by a gravity compensation algorithm. From the operator's perspective, the exoskeleton arm is weightless.
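The idea behind gravity compensation can be sketched on a planar two-link arm (a stand-in for the 7-DOF exoskeleton, whose full model follows the same pattern). The actuators apply exactly the torques needed to hold the links against gravity, so the operator feels none of the exoskeleton's weight. All masses, lengths, and centers of mass below are illustrative placeholders, not the EXO-UL3 parameters.

```python
import math

def gravity_compensation_torques(q, masses=(2.0, 1.5), com=(0.15, 0.12),
                                 l1=0.30, g=9.81):
    """Torques the actuators must apply to hold a planar 2-link arm
    against gravity. Joint angles q = (shoulder, elbow) are measured
    from the downward vertical; parameter values are illustrative."""
    m1, m2 = masses   # link masses [kg]
    r1, r2 = com      # distance from each joint to its link's center of mass [m]
    q1, q2 = q
    # Elbow actuator supports the weight of the forearm link
    tau2 = m2 * g * r2 * math.sin(q1 + q2)
    # Shoulder actuator supports both links (the forearm's weight also
    # acts through the upper-arm link of length l1)
    tau1 = (m1 * r1 + m2 * l1) * g * math.sin(q1) + tau2
    return tau1, tau2
```

With the arm hanging straight down both torques are zero; as the arm rises toward horizontal, the commanded torques grow to carry the full link weights.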


Exoskeleton Prototype 3 - Null Space
Positioning and orienting an object in three-dimensional space requires 6 DOF (3 DOF for translation and 3 DOF for orientation). The human arm, like the exoskeleton, is a redundant mechanism with 7 DOF – one more DOF than is needed to position and orient an object in space. As a result, these systems include a "null space": a space in which the hand (end effector) remains in a fixed position and orientation while the arm can take on infinitely many configurations, i.e., infinitely many solutions to the inverse kinematics problem. The video clip demonstrates the null space of the exoskeleton.
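The null space can be made concrete with the arm's Jacobian. For a 7-DOF arm, the 6x7 task Jacobian J maps joint velocities to hand velocities; projecting any joint velocity through N = I - J⁺J yields self-motion that reconfigures the arm while the hand stays fixed. The sketch below uses random numbers in place of a real exoskeleton Jacobian.

```python
import numpy as np

# Illustrative stand-in for the exoskeleton's 6x7 task Jacobian:
# 7 joint velocities in, 6 hand (end-effector) velocities out.
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))

# Null-space projector: maps any joint velocity onto the subspace
# producing zero hand motion (1-dimensional for a full-rank 6x7 J).
N = np.eye(7) - np.linalg.pinv(J) @ J

v = rng.standard_normal(7)   # arbitrary joint-velocity command
v_null = N @ v               # projected self-motion of the arm
hand_vel = J @ v_null        # ~0: the hand stays fixed while the arm moves
```

Driving the joints with `v_null` sweeps the arm through the infinitely many inverse-kinematics solutions the video demonstrates, all sharing one hand pose.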

Exoskeleton Prototype 3 - Virtual Reality
This clip demonstrates the application of the exoskeleton as a haptic device. A user utilizes the upper limb exoskeleton (EXO-UL7) to interact with a virtual ball. The interaction in the virtual environment generates a force between the virtual human figure and the virtual ball. This force is rendered by the physical exoskeleton arm and applied to the user's arm. As a result, the user feels the haptic interaction as if interacting with a real ball. The virtual environment was developed with the Microsoft Robotics Studio toolbox.
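A common way to render such a contact force is penalty-based haptics: when the hand penetrates the virtual ball, push back along the surface normal in proportion to penetration depth. The sketch below illustrates that general technique, not the lab's actual rendering code; the function name and stiffness value are hypothetical.

```python
import math

def contact_force(hand_pos, ball_center, ball_radius, stiffness=500.0):
    """Penalty-based haptic rendering of a virtual ball: returns the
    3D force to command to the exoskeleton. Stiffness [N/m] is an
    illustrative placeholder."""
    dx = [h - c for h, c in zip(hand_pos, ball_center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = ball_radius - dist
    if dist == 0.0 or penetration <= 0.0:
        return (0.0, 0.0, 0.0)   # no contact: render zero force
    scale = stiffness * penetration / dist   # force along outward normal
    return tuple(scale * d for d in dx)
```

Run at the haptic update rate, this force command is what the exoskeleton's actuators reproduce on the user's arm, creating the sensation of touching a stiff ball.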