I have a degenerative eye disease called keratoconus. It’s a gradual thinning of the cornea, causing blurred vision and light halos as the condition progresses. While I can do most things normally, there are various workarounds I use to help. That includes an app called Be My Eyes. First released in 2015, Be My Eyes connects visually impaired users to sighted volunteers through a video call. Once connected, the volunteer helps guide the user through whatever task they need assistance with. I’ve used the app to find my gate at airports and navigate small print. Occasionally, I’d use Be My Eyes at restaurants, the volunteer playfully editorializing on beer selection as they’d list off names from an overhead chalkboard. We’d end up debating the deliciousness of different IPAs.
There is something legitimately special about using the app, especially the first few times. Connecting the visually impaired with sighted volunteers offers a real opportunity for empathy and a better understanding of the everyday realities of vision loss. Still, on a practical level, Be My Eyes depends on the availability of volunteers and their competency navigating users through a task. It makes the app feel more like a fun novelty than a tool to rely on. With a new update, that’s about to change.
This winter, Be My Eyes is rolling out an AI-powered digital assistant called Be My AI. Users can open the app and point their phones toward what’s in front of them. The phone then offers a detailed verbal description of the setting and other relevant information like furnishings, vehicles, people, or appliances. Users can ask follow-up questions if they need more detailed information or a specific question answered.
Powered by GPT-4, OpenAI’s large multimodal model, Be My AI can serve the same function as a human volunteer for most basic tasks, while also offering better privacy and a more consistent experience. It’s not a replacement for a cane or guide dog, but for people with sight loss, Be My AI offers a level of autonomy and independence that previously seemed impossible. It’s now available for users in beta.
For all the fear around artificial intelligence and the valid criticisms of emerging technology, Be My AI is an obvious net positive. A robot reads for me when I’m not able to. That’s the kind of practical use for advanced tech promised to us in science fiction, and a lot more encouraging, to me at least, than AI’s ability to write parodies of The Office in the style of Ernest Hemingway. The app helps me do my work more independently and navigate unfamiliar spaces when I’m traveling solo. Lately, using the app has me thinking a lot about AI, accessibility, and working in non-traditional environments.
While assistive tech has always been imperative for folks with sight loss—screen magnifiers, text-to-speech functions, dark mode, and braille displays—advances in AI have helped break accessibility barriers in the workplace. That’s especially true when it comes to remote work. Virtual assistants like Siri and Alexa can help navigate computers and phones, while programs like Microsoft Copilot offer image description tools that give a detailed account of what’s happening in a picture, a particularly useful tool if an image doesn’t have alt text.
Many folks with access needs have also been using AI tools in new and innovative ways: take broadcaster Steven Scott, the co-host of Access Tech Live. His weekly broadcast addresses technology from an accessibility perspective, giving individuals with disabilities an inside look at how they might use different tools. It’s a smart and entertaining vantage point on this quickly-moving space from someone in the know. And Scott uses the tools himself, integrating AI into his prep work for the show.
While sighted individuals can quickly scan through articles on the internet, Scott relies on a screen reader to parse through the news. That can be a slow process. Where a sighted person might be able to skim an article in seconds for relevant information, listening to each story in its entirety can take hours. Instead, Scott feeds links into ChatGPT to get concise summaries of articles. Then he digs deeper if he needs more information on the subject. ChatGPT allows him to draft scripts significantly faster, the innovation built out of necessity.
Scott also uses the AI-crafted summaries to cut out the noise of today’s internet. With each website barraging users with pop-ups—sign up for our newsletter! Visit our sponsors! Give us your first born!—navigating those spaces can be a real barrier.
“ChatGPT helps strip all that nonsense out,” says Scott. “It’s allowed me to enjoy writing again. I’ll ask GPT a question based on a web search. I’ll ask it to focus on this, or tell me about this. And from there I can start to build my scripts.”
Like many, Scott now works from home. A few short years ago that idea seemed far out, at least for most jobs. But nowadays going into the office might seem odd for some. The work-from-home trend and the flood of new technologies are a godsend for many folks with disabilities. Scott is hopeful that trend will continue.
“AI is moving in such a way that it feels to me that the operating system of the future will not really require much more than a swipe of the hand, or a voice command to enact what we want it to do. Can you imagine the difference that would make for disabled people? Some of that is already starting now.”
This isn’t to say that everything is perfect as is. On a personal level, one of the major struggles with working remotely as a freelancer has been getting onboarding materials, invoices, and documents in formats that are accessible to me. Similarly, common work tools that use collaboration boards to display information or assign tasks to workers can be a headache to learn and navigate, my zoom tools working overtime just to keep up. New technologies—including those powered by AI—have certainly helped with my vision challenges, but successful integration also requires a buy-in from team members and an overall willingness to learn.
“There’s a responsibility in all of us to face up to what our problems are,” says Scott. “I acknowledge the challenges and work through those, but we can only do that together as a community, as far as I’m concerned.”
Nearly 1.8 million people with a disability have joined the labor force since just before the pandemic hit the U.S., a 28% increase, according to the Labor Department. One of the major reasons is remote work. It’s an encouraging statistic, especially considering that folks with access needs have a historically higher rate of unemployment than those without. New tools backed by AI will help continue to bridge that gap, pointing to a more equitable future. But more than anything, I think it’s important to remember that just because a person may need to do something differently (or may need assistance to get something done) it doesn’t mean they can’t do it at all. Remote work allows more disabled people access to good jobs, but it’s up to all of us to create the space for employees to succeed. With tools like Be My AI and other emerging assistive tech, that’s getting easier and easier to do.