Meta has announced several new initiatives to enhance accessibility across its products, timed to coincide with Global Accessibility Awareness Day. The company aims to improve access for people with disabilities through its glasses, wristband devices, extended reality products, and AI models.
Ray-Ban Meta glasses are designed to give the blind and low-vision community a hands-free way to capture and share photos, send messages, make calls, listen to music, translate speech in real time, and interact with Meta AI for immediate assistance. "Since launching Ray-Ban Meta, people have captured and shared millions of moments with loved ones," Meta stated.
Meta is also introducing the ability to customize Meta AI on Ray-Ban Meta glasses so it gives detailed responses based on what is in the user's environment; the feature will launch first in the U.S. and Canada before expanding globally. In addition, the Call a Volunteer feature, developed in partnership with Be My Eyes, will soon roll out to the 18 countries where Meta AI is supported, connecting blind and low-vision individuals with sighted volunteers for real-time assistance.
Meta is also developing wristband devices that use surface electromyography (sEMG) to enable human-computer interaction for people with physical disabilities such as hand paralysis or tremors. sEMG measures the electrical signals that muscles generate at the wrist, so a wristband can register a user's intended movements and translate them into input even when large movements are not possible due to conditions like spinal cord injury or stroke.
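Meta has not published implementation details, but the core idea, turning faint muscle activity into discrete input events, can be sketched in a few lines. The snippet below is a minimal illustration only: the sampling rate, filter settings, activation threshold, and simulated signal are all assumptions for the sketch, not Meta's design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

SAMPLE_RATE_HZ = 2000    # assumed sEMG sampling rate
ENVELOPE_CUTOFF_HZ = 5   # low-pass cutoff for the activation envelope
CLICK_THRESHOLD = 0.1    # assumed activation threshold (normalized units)

def semg_envelope(window: np.ndarray) -> np.ndarray:
    """Rectify a raw sEMG window and low-pass filter it into a smooth envelope."""
    rectified = np.abs(window)
    b, a = butter(2, ENVELOPE_CUTOFF_HZ / (SAMPLE_RATE_HZ / 2), btype="low")
    return filtfilt(b, a, rectified)

def detect_activation(window: np.ndarray) -> bool:
    """Return True if the envelope crosses the threshold, i.e. the user
    produced an intentional contraction, however small the movement."""
    return semg_envelope(window).max() > CLICK_THRESHOLD

# Example: a simulated 0.5 s window containing a brief, faint contraction.
rng = np.random.default_rng(0)
window = 0.02 * rng.standard_normal(SAMPLE_RATE_HZ // 2)           # baseline noise
window[400:600] += 0.5 * np.sin(np.linspace(0, 40 * np.pi, 200))   # muscle burst
print("click" if detect_activation(window) else "no input")
```

In practice, a fixed threshold would give way to per-user calibration or a learned decoding model, which is what the research described next is meant to evaluate.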
Research efforts include a collaboration with Carnegie Mellon University and data collection through a clinical research organization to evaluate how well sEMG-based models can provide computer control for individuals with hand tremors or paralysis.
Beyond hardware, Meta is focusing on communication accessibility in the metaverse by offering live captions and live speech in its extended reality products. Live captions convert spoken words into text in real time, while live speech turns typed text into synthetic audio for users who prefer not to speak or cannot speak.
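Meta's versions run inside its XR software, but the two building blocks, speech-to-text and text-to-speech, are easy to illustrate with off-the-shelf tools. The sketch below uses the third-party SpeechRecognition and pyttsx3 Python packages purely as stand-ins; it is not Meta's pipeline.

```python
# pip install SpeechRecognition pyttsx3  (microphone input also requires PyAudio)
import speech_recognition as sr
import pyttsx3

def live_caption_once() -> str:
    """Capture one utterance from the microphone and return it as text
    (the speech-to-text half of a live-captions feature)."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)  # cloud STT; any engine works here

def live_speech(text: str) -> None:
    """Speak typed text aloud (the text-to-speech half of a live-speech feature)."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    print("Caption:", live_caption_once())
    live_speech("This is synthetic audio generated from typed text.")
```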
Developers at Sign-Speak are using Llama, Meta's family of open-source AI models, to build a WhatsApp chatbot that translates American Sign Language (ASL), making it easier for Deaf and hearing people to communicate.
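Sign-Speak has not published its pipeline, but the Llama half of such a system can be sketched with the Hugging Face transformers library. In the hypothetical helper below, the model ID, prompt, and gloss-style input are illustrative assumptions; a real ASL system would feed it the output of a video-based sign-recognition model.

```python
# pip install transformers torch
from transformers import pipeline

# Assumed model ID (gated on Hugging Face); any Llama instruct checkpoint would do.
generator = pipeline("text-generation", model="meta-llama/Llama-3.1-8B-Instruct")

def gloss_to_english(asl_glosses: str) -> str:
    """Turn a sequence of recognized ASL glosses (e.g. from an upstream
    sign-recognition model) into a fluent English sentence via Llama."""
    messages = [
        {"role": "system",
         "content": "You translate ASL gloss sequences into natural English."},
        {"role": "user", "content": asl_glosses},
    ]
    result = generator(messages, max_new_tokens=60)
    # The pipeline returns the full conversation; the last message is the reply.
    return result[0]["generated_text"][-1]["content"]

# Hypothetical gloss sequence produced by a sign-recognition model.
print(gloss_to_english("STORE I GO TOMORROW"))
```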
"We’re committed to investing in features and products that make connection easier for all," stated Meta as it continues its efforts to address accessibility needs worldwide.