Eye-based communication languages such as Blink-To-Speak play a key role in expressing the needs and emotions of patients with motor neuron disorders. Most existing eye-based tracking systems are complex and not affordable in low-income countries. Blink-To-Live is an eye-tracking system based on a modified Blink-To-Speak language and computer vision, built for patients with speech impairments. A mobile phone camera tracks the patient's eyes by sending real-time video frames to computer vision modules for facial landmark detection, eye identification, and tracking. The Blink-To-Live eye-based communication language defines four key alphabets: Left, Right, Up, and Blink. These eye gestures encode more than 60 daily-life commands, each expressed by a sequence of three eye movement states. Once the eye-gesture-encoded sentences are generated, the translation module displays the phrases in the patient's native language on the phone screen, and a synthesized voice can be heard. A prototype of the Blink-To-Live system was evaluated with unimpaired participants of different demographic characteristics. Unlike other sensor-based eye-tracking systems, Blink-To-Live is simple, flexible, and cost-efficient, with no dependency on specific software or hardware requirements. The software and its source are available from the GitHub repository ( ).

Amyotrophic Lateral Sclerosis (ALS) and Primary Lateral Sclerosis (PLS) are progressive motor neuron diseases that affect cells in the brain and spinal cord, gradually causing loss of muscle control and the development of speech impairments. In the later stages of the disease, patients can communicate with their caregivers through eye gestures [1, 2]. Translating eye gestures into speech has produced a plethora of Augmentative/Alternative Communication (AAC) devices with different designs and usability concepts, ranging from control panels with letters and numbers, to touch- and gaze-sensing screens, to eye-tracking systems and the modified mouse-cursor techniques introduced to control different computer applications. Commercial gaze-sensing keyboards are very expensive: the Tobii Dynavox [3], for example, costs from $5K to $10K depending on the configuration model. The Eye Transfer (E-tran) board [4] is a low-cost alternative (about $260) in which a caregiver holds a transparent plastic board of printed letters and observes the patient's eye gestures toward the board. Head-mounted eye-gaze trackers [5] require static, adjusted settings to align the camera with the patient's eye during head movement. Brain-Computer Interfaces (BCIs) use brain activity (i.e., EEG signals) to control external devices, such as typing words by selecting letters on a digital keyboard [6], or performing complex tasks such as browsing a web page [7] or painting an image [8].
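The eye identification step described above ultimately has to reduce each video frame to one of the four alphabets. A minimal sketch of how that classification could work from 2D landmark coordinates is shown below; the landmark names, threshold values, and the neutral "Center" state are assumptions for illustration, not the actual Blink-To-Live implementation.

```python
# Sketch: classify one eye's state from five 2D landmark points,
# as produced by any facial-landmark detector. Image coordinates
# are assumed, so y grows downward. All thresholds are illustrative.

def classify_eye_state(pupil, left_corner, right_corner, top_lid, bottom_lid):
    """Return 'Blink', 'Left', 'Right', 'Up', or 'Center' (assumed neutral)."""
    eye_width = right_corner[0] - left_corner[0]
    eye_height = bottom_lid[1] - top_lid[1]
    # A nearly closed lid gap reads as a blink.
    if eye_height < 0.15 * eye_width:
        return "Blink"
    # Horizontal position of the pupil within the eye, 0.0 .. 1.0.
    hx = (pupil[0] - left_corner[0]) / eye_width
    if hx < 0.35:
        return "Left"
    if hx > 0.65:
        return "Right"
    # Vertical position of the pupil within the lid gap.
    vy = (pupil[1] - top_lid[1]) / eye_height
    if vy < 0.35:
        return "Up"
    return "Center"

# Example: eye corners at (0, 10) and (40, 10), lids at (20, 4) and (20, 16).
print(classify_eye_state((10, 10), (0, 10), (40, 10), (20, 4), (20, 16)))
# -> Left
```

In a real pipeline, per-frame states like these would be smoothed over a short window before being committed to a gesture, so that a single noisy frame does not break a three-state sequence.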
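The translation step, where triples of eye states become phrases, can be sketched as a simple lookup over three-state sequences. The phrase mappings below are made-up placeholders, not the actual Blink-To-Speak dictionary of 60+ commands, and `decode` is a hypothetical helper name.

```python
# Sketch of gesture-to-phrase translation, assuming a dictionary keyed
# by sequences of three eye states. Phrases here are illustrative only.

STATES = {"Left", "Right", "Up", "Blink"}

# Hypothetical subset of the >60 daily-life commands.
PHRASEBOOK = {
    ("Left", "Left", "Blink"): "I am thirsty",
    ("Right", "Up", "Blink"): "I need to rest",
    ("Up", "Up", "Blink"): "Please call the doctor",
}

def decode(gesture_stream):
    """Group a stream of detected eye states into triples and translate
    each triple into a phrase (or a placeholder for unknown gestures)."""
    phrases = []
    for i in range(0, len(gesture_stream) - 2, 3):
        triple = tuple(gesture_stream[i:i + 3])
        if not all(s in STATES for s in triple):
            raise ValueError(f"unknown eye state in {triple}")
        phrases.append(PHRASEBOOK.get(triple, "<unrecognized gesture>"))
    return phrases

print(decode(["Left", "Left", "Blink", "Up", "Up", "Blink"]))
# -> ['I am thirsty', 'Please call the doctor']
```

The decoded phrases would then be handed to the display and speech-synthesis module, which renders them in the patient's native language.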