Family BCI Insights

UCSF/Berkeley BCI restores speech to locked-in patient after 18 years at ~80 wpm


Key Questions

What breakthrough did UCSF and Berkeley achieve with locked-in patient Ann Johnson?

UCSF and Berkeley developed a brain-computer interface (BCI) that enables Ann Johnson, a stroke patient with locked-in syndrome who had been unable to speak for 18 years, to produce speech at about 80 words per minute through a personalized avatar and text output. The patient-specific decoder demonstrates that bedside communication is viable for people with ALS or locked-in syndrome.

How does this BCI compare to previous brain-to-speech technologies?

It echoes advances such as the Chang lab's vPCG work and the UC Davis brain-to-speech system, including the recent Casey Harrell case, in which an ALS patient regained the ability to speak to his daughter via a brain implant. Reports of high-speed speech generation from this new system reinforce that line of development.

What is the current status of this UCSF/Berkeley BCI technology?

The technology is still in development, pending peer-reviewed data on information transfer rate (ITR, in bits per minute), accuracy, stability, safety, and generalization to ALS patients. Multiple recent reports corroborate the result, highlighting its potential for communication recovery after stroke and in locked-in syndrome.
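To make the pending ITR metric concrete: BCI studies commonly report information transfer rate using the Wolpaw formula, which combines the number of selectable targets, decoding accuracy, and selection speed into bits per minute. Below is a minimal sketch of that standard formula; the example figures (26 targets, 90% accuracy, 80 selections per minute) are illustrative assumptions, not reported results for this system.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Information transfer rate (bits/min) via the Wolpaw formula.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    where N is the number of classes and P the decoding accuracy.
    """
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if 0 < p < 1:  # at P = 1 the correction terms vanish
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min

# Hypothetical example: 26-letter alphabet, 90% accuracy, 80 selections/min
print(round(wolpaw_itr(26, 0.9, 80), 1))  # → 301.4 bits/min
```

This is why raw words-per-minute alone is not a complete benchmark: two systems at ~80 wpm can carry very different information rates depending on vocabulary size and accuracy.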


Updated May 1, 2026