Building the Perfect Access Technology Partnership

By Erin Lauridsen, Director of Access Technology

I’m Erin Lauridsen. I’m blind and proud of it, which means that I am profoundly personally invested in my work in digital accessibility. In the course of my career so far, I’ve worked with many companies at all points along their accessibility journeys. In that work, I’ve at times encountered openness and innovation, but at other times I have encountered friction born of a lack of cultural competence around disability. On this tenth Global Accessibility Awareness Day, I want to share with you the keys, as I see them, to bringing disability cultural competence to your accessibility work. Whether you work in compliance, user experience, marketing, engineering, or leadership, these are reflections from my lived experience of disability, the ways that experience influences and is impacted by my work in corporate accessibility, and how we can work together.

I introduced myself to you as blind. I can’t tell you how many times I’ve heard hesitation in the voice of someone approaching a conversation with me about blindness or accessibility, because they are afraid of using the wrong word, or afraid that naming my disability will be seen as negative. People with disabilities use many different terms to describe our identities, or in some cases, we may not consider disability to be central to our identity at all. In the vision world, you may hear ‘blind’, which can represent anything from total blindness to a decrease in visual acuity significant enough to impact reading or driving. You may hear ‘visual impairment’, a clinical term for the same range. You may also hear ‘low vision’, a term for the less-than-perfect but still usable range of visual functioning, and you may hear ‘vision loss’, a term common among older adults, or those who have acquired blindness later in life. Those are just the first four terms that come to mind; I have heard many others over the years, and this is just one of many disabilities you might discuss. Which term a person uses is influenced by their preferences, lived experience and cultural identity.

For myself, blindness is a lifelong part of me, so vision loss doesn’t ring true. Visual impairment conjures memories of childhood appointments with professionals who wanted to cramp my carefree kid style with clinical evaluations, and I don’t have enough usable vision for low vision to fit well. I’m happy to call myself blind: it’s concise, and it makes me think of the affinity I feel when I hear another cane tapping down Bay Area streets, or when I’m having a late-night chat with other blind cooks about knife sharpening techniques.

So, if you are entering or starting a conversation about disability, how on earth do you choose which term to use? Here’s my advice: when you are working with an individual, check in about preferred terms. Ask how they identify. When you are speaking or writing about specific disabilities, reach out to disability-led organizations and advocacy groups to learn about identity language. Be open to following our lead on language, even if it is language that feels new or uncomfortable to you.

I often navigate conversations grounded in someone’s fear or imaginings about what the lived experience of disability must be like. This often leads to over-engineering solutions, or to solving for a nonexistent or trivial problem. I have more than one story about a person coming to LightHouse with a multi-part camera and processor system for text recognition or object identification, who became defensive when learning that the blind people in the room can read text quickly with our phones, and wouldn’t be willing to carry around a bulky camera just so it could shout out “refrigerator,” “toilet,” “goat” as we encountered these things in our wanderings. Others have taken the time to listen deeply to how blind people read text and explore our environments, and the innovations they are working on will take current good solutions to the next level. If you are designing or coding for a disability that you do not live, check your assumptions with the community. Listen to what our friction points are, and work with us to identify good solutions.

In this work, I often must balance the need for disability awareness and education with jarring requests for personal disclosure. Once when I was explaining how being able to adjust brightness is useful to people with many different eye conditions, I used myself as an example of someone who does best with reduced glare. The researcher I was speaking with exclaimed, “Oh, is that why your eyes move that way!” I hope my next eye movement was an exasperated eye-roll, as we’d abruptly shifted from talking about how I customize technology settings to my needs, to talking about my body. If you are doing product research, or educating yourself about disability in the course of accessibility work, you can start by asking about tools and technologies rather than about medical diagnoses or the functional limitations of someone’s body. You can learn a lot more about how I use an app by asking what accessibility features I run on my phone than by asking what eye condition I have or how much I can see. Take the time to consider why you are asking a personal question, and in what setting you are posing it. While you might be curious about how I pick up after my guide dog, it really isn’t the best topic for our business lunch. However, if you want to innovate a solution to find trash cans on busy city streets, I’m all about sharing my dog walking routine in that context.

Often I hear that a company has designed or tested for accessibility by focusing on only one user with a disability. Perhaps they have a blind engineer on their team, or they may have connected with one end user of their product who has invested in giving them a great deal of feedback. While these are both wonderful things, neither is comprehensive, because disability intersects with every part of the human condition, and may create different challenges and opportunities based on those intersections. The skills and tools I use to navigate digital spaces are influenced by my economic privilege, my early access to education, and my linguistic and cultural background. Despite a preference for Braille reading, having had access to screen readers early in life has improved my ability to process complex web pages quickly using text to speech. The same task presents a significant hurdle for some of the adult learners I have worked with, especially those who are learning language or literacy skills along with digital access. You may have watched a blind coder execute complex keyboard sequences to control a screen reader, but an older adult with arthritis may be challenged by pressing multiple keys at once. Just as with any customer base, it’s important to avoid designing or remediating for one person or one persona. Have professional experts as well as end users with disabilities engaged in the design and testing of your products. Please do hire that blind engineer though, she’s spent her whole life innovating and hacking solutions for a world that often doesn’t consider her in the scope of design, and that skill set is going to make your product better.

Sometimes people reach out to me for help with an empathy lab or asking for a blindfold experience, and I do my best to help them find another way to learn. You cannot try on the many intersections of a lived experience, and I can’t instill all the skills, culture, and adventures of a blind life by putting a blindfold on your face. Please avoid using empathy exercises that encourage you to try on a disability experience for a brief moment or a day. Instead, invest your time in learning from the lived experiences of people with disabilities, and learning about the tools and technologies we use. If you try a screen reader for a moment, you may find it challenging in the way that switching modalities can be challenging for anyone, but if you invest quality time in learning how screen readers work, you may discover, and then fix, a pain point with your product. Recognize that digital accessibility is not just a topic limited to your livelihood; consider it a way to build stronger communities and relationships throughout your life. For example, you can incorporate image descriptions into your personal social media posts, not just your company’s website.

I hope these reflections will encourage you to take the next step on your personal or company accessibility journey. Ask yourself how you can more deeply engage with the people your accessibility work impacts, and take the next step to increase that dialogue. Whether you’re just beginning, or are part of a robust accessibility initiative, there is always more to learn. I hope I get to meet you along the way.

A Week with Be My Eyes: The First Truly Social Network

On May 11 from 5:00 to 7:00 p.m., LightHouse will host Be My Eyes and its blind or low vision users for an evening of creative use, feedback and even a bit of friendly competition. The Be My Eyes team will take blind users through the past, present and future of the technology, and share some incredible stories about the iPhone app that connects blind people to a network of sighted volunteers via live video chat. The event is free and intended for blind and low vision users – RSVP on Facebook.

We love our independence. Even if our vegetables are grown and picked by hundreds of hands, our cars designed by teams of closely collaborating engineers, and everything from our electricity to our government benefits kept running by vast networks of individuals — modern-day technology and consumption are designed to make us feel self-sufficient.

We are thus allowed to hold ourselves to ideals of self-determination and rugged individualism that have been passed down over the centuries. As blind people, these values are challenged every day of our lives. When something is poorly designed or downright unusable, we confront a deep conundrum: going it alone or asking for help, and risking the perceived possibility of burdening others.

When Be My Eyes launched nearly two years ago, a new tool was born: a radically different way to ask for help. Be My Eyes introduced blind smartphone users to a whole new type of social support network, one unbounded by geography, bureaucracy, or even practical limitations, that allowed blind users to get sighted assistance via video chat.

Today there are about half a million sighted volunteers with Be My Eyes loaded onto their phones, with more than 30,000 blind users on the other end. These volunteers will do anything from helping you adjust the thermostat to spending half an hour helping you pick out an outfit for a high-stakes presentation. But at its core, each interaction is random, at-will and obligation-free. The free app puts no limit on the number of calls you can make in a day. If you really wanted to, you could call 100 different people and have each of them identify the exact same piece of art – and the service, as always, would be free.

Even though thousands of blind people benefit from this app every week, the platform can handle thousands more. I often wonder if our notion of independent living is so ingrained, so hard-wired, that we still have trouble asking for help, even when there are really no strings attached.

Be My Eyes is working toward a gold standard for people helping people. They have hundreds of thousands of hours of free labor, given in good faith, at a moment’s notice, from people all around the world. It’s truly a new tool – like a fishing pole that reels in assistance whenever you want it. But as the old saying goes, you have to “teach a man to fish” before he can really benefit from the tools at hand.

Last month, I challenged myself to reconsider how I use the app. Occasionally I will be somewhere, alone, and realize that I am struggling. We all do this, sighted and blind alike: make things harder for ourselves than they need to be.

For one week, I told myself, any time I needed help I would pull out the app and give it a spin. What came out of it was surprising. Watch the video below to see Be My Eyes in action.

Not only did I use it for things I never thought it could work for – like identifying house numbers as I walked through a neighborhood or even the types of fish on my sushi plate – but I met people who were patient, not overbearing, and curious as to what they could do to be helpful without being obtrusive.

No one asked me personal questions, no one tried to coach me on how to live my life, and above all no one grabbed me by the arm and steered me somewhere I didn’t want to go. When I got what I needed, I could politely say thank you and hang up without fear that being brisk with someone would have repercussions later. It’s all the value of having someone nearby without any of the additional worry of initiating contact, explaining yourself, and ultimately breaking free of their custody.

Our understanding of “independence” is not truly about total independence, but instead about masking the assembly line of helpers that makes up our lives: the tiny little micro-transactions where individuals step in to provide assistance, whether or not we have a disability. For blind people, this is a more obvious reality than for most.

The reason Be My Eyes is so remarkable is because it embraces this reality wholesale: You can get the tiniest bit of help and move on through your life. The safety net is huge, and yet doesn’t loom over you.

Maybe it makes sense, then, that the guys behind Be My Eyes hail from Denmark, where you’re much more likely to hear about a more “social” approach. And if we think of human interaction as give and take, as an exchange of ideas or assistance as a true social interaction – maybe Be My Eyes has created the first truly social network.

HIMS Assistive Tech Demo Day Comes to LightHouse in October

HIMS has just announced its Demo Day at LightHouse for the Blind. Download the flyer here or read full text below:

Coming to San Francisco October 4, 2016!

Come learn about new advances in technology for low vision and blindness!

When: October 4, 2016 11:00 AM – 3:00 PM

Where: San Francisco Lighthouse for the Blind and Visually Impaired
1155 Market Street, 10th floor, San Francisco, CA 94103

What’s new at HIMS? Join Damian Pickering and Paul Stevenson for the latest braille and low vision product news. Stay for lunch on us. We welcome this opportunity to share our latest innovations. We would also love to hear your dream wishlist of features and products you’d like to see from HIMS.

Learn about and try Braille Notetakers, Braille Displays, DAISY Players, Video Magnifiers/OCR and more

RSVP to Paul Stevenson by Monday, October 3rd by calling 888-520-4467 ext. 316 or emailing paul@hims-inc.com.

Meet DictationBridge: Hands-free Typing Sponsored by LightHouse for the Blind

Speech recognition and screen readers are both valuable tools for the blindness community, but what about technology that combines the two? Unfortunately, the current options are few, sometimes unstable and often expensive.

That’s why, when a group of notable blind technologists and power-users from around the country brought the idea for DictationBridge to LightHouse Labs, our organization knew we had to help. The investment in DictationBridge, which represents the LightHouse’s expanding capability to invest in projects meaningful to the blindness community, will help ensure that the software is released to the universe free of charge.

“We on the DictationBridge team are proud to have the Lighthouse for the Blind and Visually-Impaired on our team,” says Lucy Greco, assistive tech expert and spokesperson for DictationBridge. “We hope this is a first in what will become a series of projects like this moving into the future.”

As the highest level sponsor in DictationBridge’s Indiegogo campaign, which met its funding goal this week, the LightHouse is proud to help bring free hands-free typing to blind folks all around the world. If you’re still a little unclear about what DictationBridge actually does, DB’s website invites you to imagine a scenario:

“James is a blind entrepreneur but injures his hand and is unable to type. He knows he has to continue working. He has heard of speech-recognition and decides to try it. He has a little bit of vision so he uses ZoomText for magnification and speech. In the current scenario, he does not have a solution. DictationBridge is going to be a generic solution which will talk to ZoomText and WSR [Windows Speech Recognition] or Dragon. Once James recovers, he may continue to use speech-recognition for productivity or he can resume a keyboard only way of working.”

That’s what we want for our community: to be able to keep working.

“The overwhelming majority of blind people worldwide cannot afford expensive and unstable solutions when they need to use dictation and a screen reader,” CEO Bryan Bashin said last week. “The Lighthouse believes it has a moral obligation to support the access needs of blind and visually-impaired people wherever they live. We applaud the creativity of the DictationBridge team to address this need and are happy to be part of their success.”

Happy typing, and check back for updates on DictationBridge’s public release.