Building Bridges Through Access: The Intel Reader’s Cross-Disability Journey

By Darryl Adams

Introduction

Before smartphones had cloud-based OCR apps, there was the Intel Reader, a purpose-built device that could capture and read printed text aloud to users, transforming access for blind and visually impaired people. But its origin wasn’t in a corporate market analysis or product roadmap. It began with one person, Ben Foss, an Intel employee with dyslexia, building a tool to help himself read. What followed became a defining experience in how assistive technology should be created: not just with innovation, but with inclusion.

Context and Origin

In the mid-2000s, OCR technology was still largely confined to flatbed scanners tethered to desktop computers. For someone who couldn’t easily read printed text, whether due to blindness, low vision, or dyslexia, mobility was a major barrier. Ben Foss, who struggled to read due to dyslexia, envisioned a portable device that could take a photo of text and read it aloud on the spot.

As the idea took shape at Intel, Ben began working with a small internal team to explore prototypes. That’s where I joined the process. At the time, I was managing R&D projects at Intel and was not yet deeply involved in accessibility work, but I was able to offer personal insight into my own access needs and the challenges I experienced with hardware and software interfaces. We used styrofoam prototypes to rapidly iterate on hardware form factors and tactile controls that would be intuitive and usable for blind users.

Co-Design in Practice

From the outset, we knew that designing for the blind community meant involving blind people directly. We consulted with users and organizations familiar with the needs of blind and visually impaired individuals. They gave candid feedback that changed the product’s trajectory. For instance:

  • Buttons were made large and tactile, with distinct shapes and textures.
  • The audio feedback was made non-intrusive but informative.
  • Reading modes allowed flexibility for both continuous text and segmented page navigation.
  • The form factor was balanced to be comfortable in one hand, with real-world grip testing.

Co-design wasn’t a single phase; it was embedded throughout the product lifecycle. We learned that some of our assumptions as engineers and product designers didn’t hold up under real-world use. Blind users prioritized durability and predictability over sleekness or novelty.

Impact and Limitations

When launched, the Intel Reader was groundbreaking. It offered blind and low-vision users the freedom to read printed material anywhere: books, menus, handouts, signs. For many, it was the first time they could independently access printed information on the go.

We partnered with HumanWare to bring the product to market, tapping into their expertise and relationships in the assistive technology community. The device earned positive attention, and for some users, it became an indispensable part of daily life.

But it wasn’t perfect. The device was expensive, and like many specialized hardware products, it lacked a long-term software update model. As smartphone platforms improved and apps like KNFB Reader emerged, the proprietary nature of the Intel Reader became a limitation.

Lessons for Future Innovators

The Intel Reader taught us more than what to build; it taught us how to build. Key lessons include:

  • Start with lived experience: Ben’s personal connection to the problem drove authenticity and urgency.
  • Iterate with the community: Blind users shaped everything from button placement to audio output.
  • Design for dignity: Simplicity and effectiveness matter more than flash.
  • One voice can spark change: Even in a global tech company, one person’s advocacy can launch a movement.

Today, innovation in assistive technology is increasingly happening in open ecosystems, with software platforms and AI-driven agents replacing proprietary hardware. But the lessons from the Intel Reader remain vital.

From Hardware to Platforms

What once required a dedicated device is now handled by apps, smart glasses, or cloud-based agents. Tools like Envision Glasses and Seeing AI show how multimodal inputs (camera, voice, gesture) can provide richer interactions. AI systems now adapt in real time, learning from user preferences and improving with use.

Still, the shift from hardware to platforms only underscores the importance of designing inclusively. Whether building software, agents, or interfaces, the design process must start with the people who will use the product in the real world.

Conclusion

The Intel Reader represents more than a technological milestone; it’s a case study in designing with empathy, intention, and community collaboration. I’m proud to have contributed to its early development and to have witnessed how one individual’s lived experience, paired with corporate support and community input, can lead to meaningful change.

For today’s technologists, the call to action is clear: build with, not for. When we center the voices of people with disabilities from the start, we don’t just make better assistive tech; we make better technology for everyone.

Ready to build beyond compliance?

Contact Access Insights to learn how we can help you create technology that empowers everyone.

Contact Us