Virtual integration environment as an advanced prosthetic limb training platform

Briana N. Perry, Robert S. Armiger, Kristin E. Yu, Ali A. Alattar, Courtney W. Moran, Mikias Wolde, Kayla McFarland, Paul F. Pasquina, Jack W. Tsao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

24 Scopus citations

Abstract

Background: Despite advances in prosthetic development and neurorehabilitation, individuals with upper extremity (UE) loss continue to face functional and psychosocial challenges following amputation. Recent advanced myoelectric prostheses offer intuitive control over multiple, simultaneous degrees of motion and promise sensory feedback integration, but require complex training to manipulate effectively. We explored whether a virtual reality simulator could be used to teach dexterous prosthetic control paradigms to individuals with UE loss. Methods: Thirteen active-duty military personnel with UE loss (14 limbs) completed twenty 30-min passive motor training sessions over 1-2 months. Participants were asked to follow the motions of a virtual avatar using residual and phantom limbs, and electrical activity from the residual limb was recorded using surface electromyography. Eight participants (nine limbs) also completed twenty 30-min active motor training sessions. Participants controlled a virtual avatar through three motion sets of increasing complexity (Basic, Advanced, and Digit) and were scored on how accurately they performed requested motions. Score trajectory was assessed as a function of time using longitudinal mixed effects linear regression. Results: Mean classification accuracy for passive motor training was 43.8 ± 10.7% (14 limbs, 277 passive sessions). In active motor sessions, >95% classification accuracy (which we used as the threshold for prosthetic acceptance) was achieved by all participants for Basic sets and by 50% of participants in Advanced and Digit sets. Significant improvement in active motor scores over time was observed in Basic and Advanced sets (per additional session: β-coefficient 0.125, p = 0.022; β-coefficient 0.45, p = 0.001, respectively), and trended toward significance for Digit sets (β-coefficient 0.594, p = 0.077).
Conclusions: These results offer robust evidence that a virtual reality training platform can be used to quickly and efficiently train individuals with UE loss to operate advanced prosthetic control paradigms. Participants can be trained to generate muscle contraction patterns in residual limbs that are interpreted with high accuracy by computer software as distinct active motion commands. These results support the potential viability of advanced myoelectric prostheses relying on pattern recognition feedback or similar control systems.
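The score-trajectory analysis described in the abstract (a longitudinal mixed-effects linear regression of session scores, yielding a per-session β-coefficient) could be sketched roughly as follows. This is a minimal illustration only: the simulated data, variable names, and the use of statsmodels' `MixedLM` with a per-limb random intercept are assumptions for demonstration, not the authors' actual data or analysis pipeline.

```python
# Hedged sketch of a longitudinal mixed-effects regression of training score
# on session number, with a random intercept per limb. All data below are
# simulated; the 0.45 trend merely echoes the Advanced-set beta for flavor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 9 limbs x 20 active motor sessions with a small upward trend.
records = []
for limb in range(9):
    baseline = rng.normal(80.0, 5.0)      # per-limb random intercept
    for session in range(1, 21):
        score = baseline + 0.45 * session + rng.normal(0.0, 2.0)
        records.append({"limb": limb, "session": session, "score": score})
df = pd.DataFrame(records)

# Random-intercept model: score ~ session, grouped by limb.
model = smf.mixedlm("score ~ session", df, groups=df["limb"])
fit = model.fit()
slope = fit.params["session"]             # per-session improvement (beta)
pval = fit.pvalues["session"]
print(f"per-session beta: {slope:.3f}, p = {pval:.2g}")
```

With enough sessions per limb, the fitted slope recovers the simulated per-session trend; in the study, the analogous coefficient quantified how much a participant's motion-classification score improved with each additional training session.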

Original language: English
Article number: 785
Journal: Frontiers in Neurology
Volume: 9
Issue number: OCT
DOIs
State: Published - 17 Oct 2018
Externally published: Yes

Keywords

  • Modular prosthetic limb
  • Myoelectric prostheses
  • Neurorehabilitation
  • Pattern recognition control
  • Surface electromyography (sEMG)
  • Upper extremity amputation
  • Virtual integration environment
  • Virtual reality therapy

