Assistive technology company Envision has unveiled the latest version of its Smart Glasses, a product that uses AI to assist blind and low-vision users.
Debuted at the CSUN Assistive Technology Conference earlier this month, the glasses are paired with an iOS/Android app during setup, but function independently going forward. The software runs on a quad-core Qualcomm processor within Google Glass Enterprise Edition 2 hardware.
The smart glasses use AI to organise different types of information from visual cues and relay that information verbally to the user. The wearable identifies friends, reads documents aloud, finds missing items around the house, and helps the wearer use public transport.
The latest version of the glasses includes new features such as:
Accurate text reading: Smart Glasses can read and translate digital and handwritten texts from various sources, including computer screens, posters, barcodes, timetables, and food packaging.
Optimized Optical Character Recognition (OCR): The OCR engine draws on tens of millions of data points, helping the Envision Glasses and apps capture and interpret images accurately.
Third-Party App Integration: Envision created an app ecosystem, making it easier for its software to integrate with external services, such as outdoor and indoor navigation. It can also recognize more than 100 currencies with the Cash Reader app.
Ally function: A secure video calling capability allows users to ask contacts for help over both Wi-Fi and mobile networks.
Language Capabilities: Four new Asian languages have been added, bringing the total number of supported languages to 60 when connected to the internet, and 26 when offline.
Layout Detection: Smart Glasses can contextualise a document, making it less confusing for the user to read a food menu, newspaper, road sign, or poster.
Top photo: Sadjad Frogh