As a communication specialist for Knowbility and a journalism lecturer at Cal State Long Beach, I know firsthand that accessibility is not a nice-to-have. Without screen readers that allow me to access information, it’s hard to imagine that I’d hold the professional roles I have today. Yet the end-user experience is an often-neglected part of the development lifecycle, which means too much web content isn’t accessible to screen reader users.
In January, I, along with Texas State University computer science student Su Park, demonstrated screen reader interactions as part of Knowbility’s Screen Readers in the Wild webinar. Both Su and I are blind and have used screen readers since childhood. We weren’t able to answer all attendee questions during the webinar, so I’m continuing the discussion here. I hope that after reading this, you’ll better understand the roles that screen readers and accessible content play in the lives of blind people, and learn of a few easy ways you can improve the user experience. So, let’s jump right in!
Enhancing the Experience with Alt Text
The alt attribute in HTML lets you attach a text description to images on the web. Also known as alternative text or “alt text,” these short descriptions are part of the Web Content Accessibility Guidelines (WCAG) and help blind and low vision people access visual information.
In general, a short description works fine: something you would say aloud in a sentence. But it depends on context. Some images are complex and need much more description, while others are purely decorative and should carry an empty alt attribute (alt="") so screen readers skip them entirely.
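As a minimal sketch of those two cases (the filenames and description here are invented for illustration), meaningful versus empty alt text looks like this in HTML:

```html
<!-- Informative image: a short, meaningful description -->
<img src="grasshopper-syringe.jpg"
     alt="A grasshopper waits on a vaccine syringe">

<!-- Purely decorative image: an empty alt tells screen readers to skip it.
     Omitting the alt attribute entirely is worse, because the screen reader
     may fall back to announcing the filename. -->
<img src="divider-flourish.png" alt="">
```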
Often, developers go too far in treating images as decorative. When JAWS or VoiceOver finds no images, or only a few with meaningful alt text, I know I’m probably missing out on something. At best, I can guess at an image’s context.
For example, take the alt text “4741527001 ‘Ted Lasso’ used mix of CGI and acting to bring soccer to life NEWS 5:51 p.m. CT Mar. 17” from the Austin American-Statesman. I assume that Ted Lasso and soccer are involved, but there’s no more detail.
On the other hand, I find that The Atlantic has been leveling up its alt text skills. I’ve found a mix of short and long descriptions, both of which have been helpful in understanding the image and its purpose. An example I recently found on the homepage is “a grasshopper waits on a vaccine syringe,” next to a headline about second vaccine doses. Though some may deem this a decorative image deserving of an empty alt, I disagree. I appreciate getting this much information by just browsing a homepage. It feels like I’m getting the same information a sighted reader would get after skimming the headlines and their corresponding photographs.
Learn More About Good Alt Text!
Creating accessible images is both a science and an art. The science reveals when you need alternative text and the techniques to use. The art is crafting alt text that helps screen reader users connect and engage. Join us April 19 for Images: Alternative Content for Accessibility and learn the basics, best practices & advanced techniques for creating accessible images. Register Now
Tabbing vs. Arrow Key Navigation
Screen readers on desktop and laptop setups work through keyboard-only interactions. But people may wonder how exactly this works. Is it all about tabbing through content or all about single-letter shortcut keys?
When I began learning JAWS version 3.2 in 1999, I used the arrow keys to navigate within Windows and in Microsoft Word. The Tab key was necessary to navigate web pages. After the virtual buffer was introduced in JAWS version 3.31, I could use the arrow keys to interact with the web. That said, accessibility testing with Tab is still necessary to discover barriers for everyone, blind or not, who uses the keyboard alone.
Su’s experience is similar to mine. “When I began learning screen readers in general, I was taught tabbing and shortcut keys first, but was also told that arrow keys, while certainly not a fast way to navigate, could reveal important information when using something I was unfamiliar with,” she said during our webinar.
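The Tab-testing point above can be made concrete with a small, hypothetical snippet (addToCart is an invented handler name): a native button is focusable with Tab and operable with Enter or Space by default, while a clickable div is invisible to keyboard-only users unless the developer recreates all of that behavior by hand.

```html
<!-- Keyboard-accessible by default: Tab reaches it, Enter/Space activate it,
     and screen readers announce it as a button -->
<button onclick="addToCart()">Add to cart</button>

<!-- NOT keyboard-accessible: Tab skips it, Enter/Space do nothing,
     and screen readers don't announce it as interactive -->
<div class="btn" onclick="addToCart()">Add to cart</div>
```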
Watch Screen Readers In Action!
If you missed Screen Readers in the Wild, you can still watch the demos!
Using JAWS to find a video on YouTube
Using NVDA to find stock information on Google Search
Using NVDA to customize an order on See’s Candies’ website
Using VoiceOver to order from the Coffee Bean iPhone app
Favorite Screen Readers and Braille Displays
JAWS for Windows has been my screen reader of choice for more than 20 years. It’s the first one I learned, I’m familiar with the keyboard commands, and it’s the most robust across the apps I use. On mobile, I’ve used VoiceOver on iOS since 2013 and am very satisfied with it.
“I switched over from JAWS to NVDA but the learning curve for me wasn’t nearly as steep,” Su said, adding: “I also use VoiceOver on my iPhone, synced with a Braille display.”
Refreshable Braille displays are quite the tech marvel. These devices use tiny pins that are quickly raised or lowered electronically to form the shape of Braille symbols. They usually include a Perkins-style Braille or QWERTY keyboard that lets the user interact with the device and remotely control a desktop or mobile device.
Su uses a BrailleSense Polaris Braille notetaker. Notetakers are like the PDAs of decades past with Braille displays built in that also pair with a laptop or mobile device. I occasionally use a Focus 40 Blue, 5th Generation display with both my iPhone and Windows laptop.
Conclusion: Screen Readers, Plus Accessible Design, Improve the World
Whether you’re learning about keyboard navigation, user preferences for screen readers and Braille displays, or why alt text matters, creating accessible experiences for blind and low vision people is a worthwhile journey. Making a few simple changes, like using semantic HTML or writing alt text that conveys meaning, will go a long way toward improving the lives of your end users and perhaps even your coworkers.
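To make the semantic HTML suggestion concrete, here is a brief, illustrative sketch: landmark elements and headings give screen reader users a structure they can jump through with shortcut keys, something a page built only from generic divs never exposes.

```html
<!-- Landmarks and headings are navigable with screen reader shortcuts:
     JAWS, NVDA, and VoiceOver can jump by region and by heading level -->
<header>Site name and logo</header>
<nav aria-label="Main">Links to site sections</nav>
<main>
  <h1>Article title</h1>
  <p>Article text…</p>
</main>
<footer>Contact and copyright info</footer>

<!-- The same layout built only from <div> elements announces no structure,
     so a screen reader user must read line by line to find anything -->
```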