Google-inspired smart glasses for the blind add some eye-catching new features – here’s how they work

AI-powered smart glasses, custom-designed for the blind and visually impaired, caused a stir in the adaptive technology industry when Envision, an award-winning assistive technology innovator, launched the sleek, AI-infused eyewear at the CSUN 2020 conference.

Today, Envision announced that it has packed its high-tech eyewear with all-new eye-catching features that will improve the everyday lives of visually impaired users.

Envision updates its AI-powered smart glasses, tailor-made for the visually impaired

Envision’s AI-powered smart glasses, tailor-made for the visually impaired, were built on Google Glass Enterprise Edition 2. Since their launch in 2020, Envision says its high-tech glasses have changed the lives of hundreds of blind and visually impaired people around the globe.

(Image credit: Envision)

I know what you’re thinking: “How do Envision’s AI-powered smart glasses work?” As the name suggests, they rely on artificial intelligence to extract information from targeted images and text – and then tell the user what they “see”. This allows visually impaired users to read work documents, recognize loved ones, find nearby belongings, use public transport, and more.

The AI glasses come with a companion app on Android and iOS. They can read and translate any type of text — handwritten and digital — from any surface, whether it’s a computer screen, a timetable, a food label or a poster. They also recognize objects, colors and faces. Heck, they can even describe scenes for users. For example, if a birthday cake with lit candles is placed in front of the user, the AI-powered smart glasses will describe what they “see”.
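For readers curious how this kind of read-aloud pipeline fits together in general terms, here is a minimal, hypothetical sketch that chains optical character recognition to text-to-speech using the open-source pytesseract and pyttsx3 libraries. It is purely illustrative and is not Envision’s actual implementation; the libraries, function name and sample image are assumptions for the example.

```python
# Illustrative sketch only: a generic image-to-speech pipeline, NOT Envision's code.
# Assumes the Tesseract OCR engine and a local text-to-speech voice are installed.
from PIL import Image      # pip install pillow
import pytesseract         # pip install pytesseract (plus the Tesseract binary)
import pyttsx3             # pip install pyttsx3


def read_aloud(image_path: str) -> str:
    """Extract any text found in the image and speak it out loud."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.say(text.strip() or "No readable text was found.")
    engine.runAndWait()
    return text


if __name__ == "__main__":
    # Hypothetical example image; replace with any photo of a document or label.
    read_aloud("food_label.jpg")
```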

(Image credit: Envision)

“Our mission is to improve the lives of the two billion people who are blind or partially sighted worldwide by providing them with life-changing technologies, products and assistive services,” said Karthik Kannan, co-founder of Envision. “By analyzing real-time user data and direct feedback from all of our communities, we are able to constantly enrich the Envision experience and innovate our products.”

As mentioned, Envision has announced the next generation of its AI-powered smart glasses. Here are the new and updated features added to the high-tech assistive glasses:

  • Document guide for accurate capture – Eliminates the frustration of taking multiple images to capture the full text of a document. Enhanced document guidance offers verbal instructions that help users position documents for optimal scanning. With this new feature, users can capture documents in one motion.
  • Layout detection – Envision smart glasses now put documents in context for users via verbal guidance, recognizing photo captions, headers and more.
  • Improved offline language capabilities – Envision added four additional languages: Japanese, Hindi, Chinese, and Korean. The total number of supported languages, when offline, is 26. When online, that number jumps to over 60.
  • Support for the development of third-party applications – Developers can now participate in Envision’s third-party ecosystem, allowing them to build apps and other services that add value to AI-powered smart glasses. Through its partnership with the Cash Reader app, Envision can now recognize over 100 currencies.
  • Ally upgrades – The Ally function, the smart glasses’ most popular feature, which allows users to communicate with trusted contacts via video conferencing, has been upgraded. The improved version is now optimized for mobile networks and WiFi hotspots.
  • Optimized Optical Character Recognition (OCR) – Envision dramatically improved image capture and interpretation accuracy.

The AI-powered smart glasses have a suggested retail price of $3,500; they can be bought directly from Envision or through its worldwide network of distributors.

Sarah C. Figueiredo