Of the world's population of 7 billion, 285 million people are visually impaired (WHO 2014). A critical behavior in their daily lives is sound recording, a way to preserve memories: they record what they want to keep. The key value of this behavior is immediacy, yet ordinary devices and even mobile applications make that immediacy difficult for them to achieve.
In recent years, many mobile operating systems have shipped with refined screen readers. Yet when a screen reader is on, every selection requires one more action than visual selection does. Combining users' reliance on sound with the mobility of their recording behavior, we developed a gesture-based solution on the mobile device to improve their recording experience. We flattened the interaction to reduce these barriers, for example by removing unnecessary text input, providing a direct sharing function, and reducing the number of action layers.
We were dedicated to creating a hearing- and touch-based interaction in which each gesture corresponds to its functionality. For instance, with a two-finger double tap on the screen, users can immediately start recording.
RESEARCH AND INSIGHT
We went through a user-experience innovation design process and finally presented the HearMe project. At the beginning, user experience research (UXR) into the current behavior of the visually impaired revealed the need for a better audio experience.
DESIGN AND FEATURES
With a two-finger double tap to capture sound, much as people press a camera shutter to take a picture, the user can initiate recording instantly. The system also provides clear voice feedback to confirm key actions.
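The two-finger double-tap trigger can be sketched as a small timing check over touch events. This is an illustrative sketch only, not HearMe's actual implementation; the event shape, the 0.3-second window, and the class name are assumptions.

```python
# Hypothetical sketch of two-finger double-tap detection; the constants
# and event shape are assumptions, not HearMe's actual code.
DOUBLE_TAP_WINDOW = 0.3  # max seconds allowed between the two taps


class TwoFingerDoubleTapDetector:
    """Fires a callback when two two-finger taps land within the window."""

    def __init__(self, on_double_tap):
        self.on_double_tap = on_double_tap
        self._last_tap_time = None

    def handle_tap(self, finger_count, timestamp):
        if finger_count != 2:
            self._last_tap_time = None  # any other tap resets the sequence
            return
        if (self._last_tap_time is not None
                and timestamp - self._last_tap_time <= DOUBLE_TAP_WINDOW):
            self._last_tap_time = None
            self.on_double_tap()  # e.g. start recording immediately
        else:
            self._last_tap_time = timestamp  # first tap of a possible pair
```

In practice the callback would start the recorder and play the confirming voice feedback; on iOS or Android the platform gesture recognizers would replace this hand-rolled timing logic.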
Recordings of classes and meetings can last for hours. While recording, the user can tag timestamps by swiping right across the touch screen. Correspondingly, during playback, the same gesture jumps to the timestamps tagged earlier.
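The tagging mechanism amounts to keeping a sorted list of playback offsets and, on each swipe during playback, seeking to the next tag after the current position. A minimal sketch, assuming tags are stored as seconds from the start of the recording (the class and method names are hypothetical):

```python
import bisect


class TaggedRecording:
    """Illustrative sketch (not HearMe's actual code) of timestamp tags."""

    def __init__(self):
        self.tags = []  # offsets in seconds from start, kept sorted

    def tag(self, offset_seconds):
        """Called on a right swipe while recording is in progress."""
        bisect.insort(self.tags, offset_seconds)

    def next_tag(self, position_seconds):
        """Called on a right swipe during playback; returns the next tag
        strictly after the current position, or None if there is none."""
        i = bisect.bisect_right(self.tags, position_seconds)
        return self.tags[i] if i < len(self.tags) else None
```

Using `bisect_right` means repeating the swipe walks forward through the tags one by one, which matches the "same gesture guides the user to the tags" behavior described above.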
A user can share a recording with a brief voice message attached, serving as a greeting or a description of what is being shared. This design spares the user the trouble of typing.
When traveling, a user can also use the current location to search for nearby sounds, such as historical stories, tour-spot introductions, and recommendations shared by others. The user thereby learns more about a place through auditory resources.
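A location-based search of this kind typically filters shared sounds by great-circle distance from the user. The following is a minimal sketch under assumed data shapes; the sample radius, field names, and function names are illustrative, not HearMe's actual API.

```python
import math

# Hypothetical sketch of the nearby-sound search; the 1 km radius and
# the dictionary field names are illustrative assumptions.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


def nearby_sounds(user_lat, user_lon, sounds, radius_km=1.0):
    """Return shared sounds within radius_km of the user, nearest first."""
    hits = [(haversine_km(user_lat, user_lon, s["lat"], s["lon"]), s)
            for s in sounds]
    return [s for d, s in sorted(hits, key=lambda x: x[0]) if d <= radius_km]
```

A production service would query a spatially indexed database rather than scanning a list, but the distance-and-radius logic is the same.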
User experience testing (UXT) has shown that HearMe brings greater operational independence to the visually impaired, while reducing errors and usability burdens compared with previous tools. The holistic UX design process ran from two rounds of user research and design proposals through iterative development, and concluded with two rounds of user experience testing to validate and improve the design repeatedly. This holistic approach helped us better understand both our users and the design itself.
The visually impaired participants expressed their appreciation that we had delivered a product they wanted and needed.