LookTel is a smartphone object recognition package intended to help the visually impaired correctly identify items such as money, CD titles and landmarks. An image captured by the phone’s camera is sent to a PC, which quickly scans through a database; when a match is found, the result is returned to the phone and spoken to the user. Other features include sighted-user assistance, a handy text-to-speech function and voice-guided interface control.
LookTel will be made available in beta this spring, giving visually impaired users a little more independence. The system is said to work on any Windows Mobile smartphone, turning the device’s camera into an object scanner.
The BaseStation software running on the PC receives the image from the phone and searches through an image library until it finds a match. For items around the home that can’t easily be identified by their packaging or shape, such as glass jars or plastic containers, special labels are available that allow the user to tag them and record a custom spoken description, which is stored in the software’s database.
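LookTel hasn’t published the details of how the handset talks to the BaseStation PC, but the flow described above (capture an image, ship it to the PC, search a library of labelled images, speak back the best match) can be sketched roughly as follows. The class names, the fingerprint matching and the sample data are placeholders for illustration, not LookTel’s actual code.

```python
# Illustrative sketch only: the phone/BaseStation exchange below is hypothetical.
# It models the described round trip: capture -> send to the PC -> search a
# library of labelled images -> return the label -> speak it on the phone.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class LibraryEntry:
    label: str          # e.g. "US $20 bill" or a user-recorded description
    fingerprint: bytes  # precomputed image fingerprint (stand-in for real features)


def compute_fingerprint(image_bytes: bytes) -> bytes:
    """Stand-in for the real image feature extraction done on the BaseStation PC."""
    return bytes(sum(image_bytes[i::16]) % 256 for i in range(16))


class BaseStation:
    def __init__(self) -> None:
        self.library: List[LibraryEntry] = []

    def add_item(self, label: str, image_bytes: bytes) -> None:
        """Used when tagging jars or containers with a custom recorded description."""
        self.library.append(LibraryEntry(label, compute_fingerprint(image_bytes)))

    def identify(self, image_bytes: bytes) -> str:
        """Search the library for the closest fingerprint and return its label."""
        probe = compute_fingerprint(image_bytes)
        best: Optional[LibraryEntry] = min(
            self.library,
            key=lambda entry: sum(a != b for a, b in zip(entry.fingerprint, probe)),
            default=None,
        )
        return best.label if best else "No match found"


def phone_capture_and_ask(station: BaseStation, camera_frame: bytes) -> None:
    """Phone side: send the captured frame, then speak the label that comes back."""
    label = station.identify(camera_frame)  # a network call in the real system
    print(f"[text-to-speech] {label}")      # spoken through the phone's speaker


if __name__ == "__main__":
    station = BaseStation()
    station.add_item("Strawberry jam (glass jar)", b"photo-of-jam-jar")
    station.add_item("US $20 bill", b"photo-of-twenty-dollar-bill")
    phone_capture_and_ask(station, b"photo-of-twenty-dollar-bill")
```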
Once an item is identified, the result is sent back to the phone and spoken to the user through the smartphone’s internal speaker. The system can be taught to recognize any objects and landmarks the user wishes to identify, and with some assistance from a sighted helper during setup it can become a useful aid for many tasks where vision makes a difference to one’s independence.
If a user is out and about and gets into trouble, a live video feed can be sent to a sighted person for assistance. As well as seeing the user’s surroundings, the helper is provided with the user’s position on Google Maps via the smartphone’s GPS, so they can either offer turn-by-turn guidance to get the user to safety or come to the rescue themselves.
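Again, the actual protocol isn’t public, but the handoff described above boils down to bundling a live video stream with the phone’s GPS fix so a helper can watch and navigate. A minimal sketch, with field names, URLs and coordinates that are purely illustrative assumptions:

```python
# Hypothetical sketch of the assistance handoff described above: the phone shares
# a live video stream plus its GPS fix, and the helper gets a Google Maps link.
# Field names, URLs and coordinates are illustrative assumptions only.

import json
from datetime import datetime, timezone


def build_assistance_request(stream_url: str, latitude: float, longitude: float) -> str:
    """Bundle everything a sighted helper needs to guide (or find) the user."""
    payload = {
        "video_stream": stream_url,  # live feed of the user's surroundings
        "gps": {"lat": latitude, "lon": longitude},
        "map_link": f"https://maps.google.com/?q={latitude},{longitude}",
        "sent_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)


if __name__ == "__main__":
    print(build_assistance_request("rtsp://example.com/user-feed", 37.7749, -122.4194))
```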
On phones that support the speech recognition features of Microsoft’s Voice Command, LookTel can be used to initiate calls, ask for the time and so on. The company has also worked on ways to turn a smartphone’s touchscreen interface into a voice-guided tactile experience, announcing menu items as a finger slides around a display grid and letting the user activate an item with a double tap.
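To make that interaction concrete, here is a rough sketch of a voice-guided grid along the lines described: sliding a finger announces the item underneath it, and two quick taps on the same item activate it. The grid contents, the double-tap window and the speak() stub are assumptions, not LookTel’s implementation.

```python
# Rough sketch of the voice-guided grid interaction described above: sliding a
# finger announces the item under it, and a double tap activates it. The grid
# layout, timings and speak() stub are assumptions for illustration only.

import time

GRID = [["Phone", "Contacts", "Messages"],
        ["Camera", "LookTel", "Settings"]]

DOUBLE_TAP_WINDOW = 0.4  # seconds between taps to count as a double tap


class VoiceGuidedGrid:
    def __init__(self, grid, cell_width=80, cell_height=100):
        self.grid = grid
        self.cell_width = cell_width
        self.cell_height = cell_height
        self.last_tap_time = 0.0
        self.last_tap_item = None

    def item_at(self, x, y):
        """Map a touch coordinate onto the menu item underneath it."""
        row = min(y // self.cell_height, len(self.grid) - 1)
        col = min(x // self.cell_width, len(self.grid[0]) - 1)
        return self.grid[row][col]

    def on_finger_move(self, x, y):
        """Announce the menu item under the finger as it slides around the grid."""
        speak(self.item_at(x, y))

    def on_tap(self, x, y):
        """Activate an item only when it is tapped twice in quick succession."""
        item = self.item_at(x, y)
        now = time.monotonic()
        if item == self.last_tap_item and now - self.last_tap_time < DOUBLE_TAP_WINDOW:
            speak(f"Opening {item}")
        else:
            speak(item)
        self.last_tap_item, self.last_tap_time = item, now


def speak(text):
    """Stand-in for the phone's text-to-speech engine."""
    print(f"[text-to-speech] {text}")


if __name__ == "__main__":
    ui = VoiceGuidedGrid(GRID)
    ui.on_finger_move(10, 10)    # announces "Phone"
    ui.on_finger_move(100, 120)  # announces "LookTel"
    ui.on_tap(100, 120)          # first tap just announces
    ui.on_tap(100, 120)          # quick second tap activates
```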
Published on March 30, 2010