MIT News (02/26/09) Trafton, Anne
Researchers in the Massachusetts Institute of Technology’s (MIT’s) Sensory Communication Group are developing tactile devices that translate sound waves into vibrations felt on the skin, which could help deaf people who rely on lip reading.
MIT research scientist and project leader Charlotte Reed says the software they are developing could be compatible with modern smart phones, enabling the devices to act as unobtrusive tactile aids for the deaf.
Most smart phones already have a microphone, digital-signal-processing capability, and a rudimentary vibration system. Tactile devices translate sound waves into vibrations, enabling users to distinguish between different sound frequencies based on which vibration pattern they feel. Tactile aids have existed for years, and current models can be held in the user’s hand or worn on the back of the neck.
However, the MIT researchers want to improve the devices by refining the acoustic signal-processing systems to provide tactile cues that are tailored to boost lip-reading abilities. The researchers have conducted several studies of the skin’s frequency sensitivity, which is greatest below 500 hertz.
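The article does not describe the researchers’ actual signal-processing algorithm, but the general idea it outlines can be illustrated with a minimal sketch: measure the energy of an audio frame in a few bands below 500 hertz (the range the skin detects best) and map each band to the drive level of a separate vibrator. The band edges, frame handling, and three-channel output here are hypothetical choices for illustration, not the MIT design.

```python
import numpy as np

# Hypothetical band edges, all below 500 Hz, where skin sensitivity is best.
BANDS = ((0, 170), (170, 330), (330, 500))

def band_energies(samples, sample_rate, bands=BANDS):
    """Estimate the energy in each low-frequency band for one frame of audio."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

def vibration_levels(energies, max_level=255):
    """Map relative band energies to drive levels for three vibrators
    (e.g., one per finger, as in the three-finger device described)."""
    total = sum(energies) or 1.0
    return [int(max_level * e / total) for e in energies]
```

A 250 Hz tone, for instance, would concentrate its energy in the middle band, so the corresponding vibrator would be driven hardest, letting the wearer distinguish it from lower- or higher-pitched sounds.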
The researchers also have done preliminary studies on deaf people’s ability to interpret the vibrations from tactile devices, including a device that can provide distinct vibration patterns to three fingers simultaneously. “Anyone who has a smart phone already has much of what they would need to run the program,” notes graduate student Ted Moallem.