I need to support VoiceOver (for blind or partially sighted users) in my application. Please guide me on how to implement this feature.
See Apple's Accessibility Programming Guide for iOS.
I think this question is quite vague, with many aspects, so I'll try to touch on a few.
Technically, if you can abstract each string from the interface, a text-to-speech engine such as this one can help: http://www.acapela-for-iphone.com/. Off the top of my head, I can think of speaking the positions of the interface elements out loud. This approach requires simple interfaces, with few elements on one screen and perhaps multiple screens for subsequent actions.
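If you go the route of speaking strings yourself rather than relying on VoiceOver, a rough sketch with Apple's built-in AVSpeechSynthesizer (available since iOS 7) might look like this; the helper name and the spoken string are made up for illustration:

```swift
import AVFoundation

// Keep a reference: a synthesizer that is deallocated stops mid-utterance.
let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper: speak a description of an interface element out loud.
func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// e.g. speak("Submit button, bottom right of the screen")
```

Note that rolling your own speech this way bypasses VoiceOver's gestures and focus model entirely, so the UIAccessibility API discussed in the other answers is usually the better route.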
The best answer I've seen so far is this article, which points to a Stanford lecture by Apple engineer Chris Fleizach:
http://www.podfeet.com/blog/tutorials-5/build-accessible-ios-apps/
Chris's video: http://www.youtube.com/watch?v=5b0V6MltEnw
The video has what we're all looking for. Chris starts discussing the relevant information around the 15:30 mark; the API discussion starts at 19:29.
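The API discussed there is essentially the UIAccessibility informal protocol on UIView. A minimal sketch (the view class and its properties are hypothetical) of exposing a custom control to VoiceOver:

```swift
import UIKit

// Hypothetical custom control that VoiceOver should treat as one element.
class RatingView: UIView {
    var rating = 3

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Tell VoiceOver this view is a single accessible element.
        isAccessibilityElement = true
        // What VoiceOver speaks first when the element gains focus.
        accessibilityLabel = "Rating"
        // The current state, read after the label.
        accessibilityValue = "\(rating) of 5 stars"
        // .adjustable lets VoiceOver users swipe up/down to change the value.
        accessibilityTraits = .adjustable
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // Called when a VoiceOver user swipes up on an adjustable element.
    override func accessibilityIncrement() {
        rating = min(rating + 1, 5)
        accessibilityValue = "\(rating) of 5 stars"
    }

    // Called on swipe down.
    override func accessibilityDecrement() {
        rating = max(rating - 1, 1)
        accessibilityValue = "\(rating) of 5 stars"
    }
}
```

For standard UIKit controls you usually only need to set `accessibilityLabel` (and sometimes `accessibilityHint`); the custom-element work above is for views drawn by hand.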
I suggest taking a look at:
- The design criteria to know well before you code.
- The developer's guide for information on how to code the essential features.
- The VoiceOver gestures, if you need to know more than just the basic ones.
- Some WWDC videos, summarized and thoroughly detailed with timestamps so you can jump straight to the relevant playback.
Many illustrations help explain the concepts, and the code snippets are provided in both ObjC and Swift.
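One essential feature such guides cover is notifying VoiceOver about dynamic changes it cannot detect on its own. A short sketch using the UIAccessibility notification API (the announcement text and `someView` are placeholders):

```swift
import UIKit

// Announce a dynamic update that VoiceOver would otherwise miss.
UIAccessibility.post(notification: .announcement, argument: "Download complete")

// After a significant layout change, tell VoiceOver and optionally move
// focus to a specific element (someView is a hypothetical on-screen view).
// UIAccessibility.post(notification: .layoutChanged, argument: someView)

// You can also check whether VoiceOver is currently running, e.g. to
// simplify animations or extend timeouts for VoiceOver users:
if UIAccessibility.isVoiceOverRunning {
    // adapt behavior here
}
```

(In Objective-C and pre-iOS 11 Swift, the equivalent call is `UIAccessibilityPostNotification`.)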