> Castor told Apple reps how amazed she was by the iPad she received as a gift for her 17th birthday just a few years earlier. It raised her passion for tech to another level — mainly due to the iPad's immediate accessibility.
Just goes to show how different perception and reality can be. I've worried that the move away from keyboard-based inputs would increase the marginalization of disabled computer users. Now that I think (more) about it, the physical form of a tablet can be just as interpretable as a keyboard, with the added bonus that app designers don't have the option of building (and prioritizing) a mouse-based interface -- I'm assuming that mouse-driven interfaces are especially difficult for the visually impaired [0].
The uniformity of interface that iOS imposes is probably especially useful for the visually impaired, provided that Apple has employees (like the one featured in the OP) on its engineering and design teams.
A blind guy I know loves Android significantly more than any other platform. He can just run his finger over the screen to "feel" what's there by listening to a voice that is WAY too fast for me to even really understand.
I've tried it myself (mostly to test out how some of our web apps work through that system) and it's actually pretty damn intuitive after you spend like 15 minutes getting the basics down.
Honestly, I don't think that "textured" screens would help all that much; audio can just convey so much more information.
For example, when you have the Android accessibility features on and you scroll through any kind of list, it plays tones: one pitch at the top of the list, sliding to another pitch at the bottom. After a bit of use, you start to intuitively understand where you are in the list just from the pitch.
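The mapping being described boils down to interpolating between two pitches based on your position in the list. Here's a minimal sketch of that idea; the specific frequencies and the linear curve are my own guesses for illustration, not the values Android actually uses:

```python
def scroll_tone_hz(index: int, count: int,
                   top_hz: float = 1000.0, bottom_hz: float = 400.0) -> float:
    """Map a list position to a feedback pitch: high at the top,
    low at the bottom. Pitches and the linear mapping are hypothetical."""
    if count <= 1:
        return top_hz
    frac = index / (count - 1)  # 0.0 at the top item, 1.0 at the bottom item
    return top_hz + frac * (bottom_hz - top_hz)
```

So the first item in a ten-item list would sound at 1000 Hz, the last at 400 Hz, and anything in between tells you roughly how far down you are without needing to see the scrollbar.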
Other things, like vibrations when you're hovering over important elements, really let you "see" what's there pretty well.
What would improve it is something 3D-Touch-esque. Being able to have a light touch mean "hover" and a hard press mean "touch" would make it much easier than the current system (IIRC it's double-tapping to activate, and other gestures like two fingers to scroll).
Years ago, I imagined a mouse with haptic feedback. You'd feel it click when your pointer crossed a boundary, with different sensations for different boundaries and areas.
I didn't even think of some kind of braille-like reading device. That'd be great.
iOS has excellent assistive technology. A good friend of mine who is blind (not totally; he can see shadows) adores his iPhone, as it allows him to go out independently with ease. He can text and use Maps to get around in places he has never been before. He has told me on multiple occasions that the iPhone literally changed his life.
When I asked him to show me how he used it, I was blown away by just how excellent the blind UX in iOS is.
I saw a GREAT conference talk earlier this year that discussed a number of reasons why phones and tablets are so valuable for visually-impaired users. It's definitely worth watching if anyone is interested in the topic and can spare ~30 minutes:
[0] http://webaim.org/articles/visual/blind