Q&A: Using software engineering to bring back speech in ALS
Erin Kunz took a circuitous route to studying the brain. The third-year PhD student in Electrical Engineering began her career developing autonomous vehicles at General Motors (GM), but now she applies her software engineering and machine learning skills in the Neural Prosthetics Translational Laboratory at Stanford, led by faculty members Jamie Henderson and the late Krishna Shenoy.
Neural prostheses, also called brain-computer interfaces (BCIs), are an emerging technology intended to restore neural functions lost to illness or injury. The newest generation of BCIs is implanted directly into the brain and can both deliver electrical signals, to stimulate specific nerves, and record the brain's electrical activity. For patients who have lost the ability to move their upper or lower bodies because of stroke or spinal cord injury, BCIs are being developed to restore limb movement and to enable the use of phones and iPads. In other settings, BCIs are being used to facilitate communication, for example as artificial ears or as devices that decode attempted handwriting.