RESNA Annual Conference - 2020

Design Of A Non-Visual Interface For Route Planning For People With Vision Loss

Zahra Alizadeh Elizei1, Jon A. Sanford2

1Master of Industrial Design student, Georgia Institute of Technology, Atlanta, GA

2Professor, College of Design, Georgia Institute of Technology, Atlanta, GA

INTRODUCTION

Navigation and wayfinding can be defined as a combination of orientation and turn-by-turn direction. Orientation concerns understanding one's surroundings, while turn-by-turn navigation concerns how to get from a starting point to an end point, whether that is a restaurant, a convenience store, a doctor's office, or a supermarket. In a navigation task, vision provides the pedestrian with landmarks and dynamic cues that are essential for updating position and orientation, as well as for estimating distance to an endpoint [2]. Independent navigation in unfamiliar and complex environments is therefore a major challenge for visually impaired people [1].

Two processes of human mobility have been identified for the design of navigation assistive systems: sensing of the immediate environment and orientation during travel [3]. The former refers to gathering spatial information for obstacle detection, while the latter involves updating the traveler's location along a route and providing continuous guidance toward the destination [4]. More precisely, the process of wayfinding is divided into four tasks: 1) orienting oneself in the environment, 2) choosing the route, 3) keeping on track, and 4) recognizing that the destination has been reached [8].
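This four-task decomposition can be made concrete as a simple state model. The following Python sketch shows one way a navigation assistant might represent it; the names and transitions are our illustration, not part of [8]:

```python
from enum import Enum, auto

class WayfindingTask(Enum):
    """The four wayfinding tasks identified in [8]."""
    ORIENTING = auto()         # 1) orienting oneself in the environment
    CHOOSING_ROUTE = auto()    # 2) choosing the route
    KEEPING_ON_TRACK = auto()  # 3) keeping on track
    ARRIVAL = auto()           # 4) recognizing the destination is reached

# Legal transitions between tasks; a guidance system cycles between
# KEEPING_ON_TRACK and ORIENTING whenever the traveler re-orients.
TRANSITIONS = {
    WayfindingTask.ORIENTING: {WayfindingTask.CHOOSING_ROUTE},
    WayfindingTask.CHOOSING_ROUTE: {WayfindingTask.KEEPING_ON_TRACK},
    WayfindingTask.KEEPING_ON_TRACK: {WayfindingTask.ORIENTING,
                                      WayfindingTask.ARRIVAL},
    WayfindingTask.ARRIVAL: set(),
}
```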

Navigation systems fall into two categories: outdoor and indoor. Most outdoor navigation assistive systems use the Global Positioning System (GPS) for traveler localization and have made outdoor navigation much easier for blind pedestrians. The main drawback of GPS is that satellite signals become significantly weaker indoors, because buildings block the line of sight between satellite and receiver [5]. The main focus of this research is outdoor navigation systems.

PROBLEM STATEMENT

In the absence of vision, visually impaired people must rely on other modalities for wayfinding.

Much research has been conducted, and many systems developed, to enable blind and low-vision people to navigate more independently. These systems use outputs such as speech, Tactons, and haptics to aid navigation. While most rely primarily on speech output, speech can be distracting and difficult to hear in a loud environment. Because blind and low-vision people depend on their hearing to understand their surroundings, speech output can also be unsafe [6].

The goal of this research is to design an effective system that helps blind people navigate independently. A further aim is to apply Universal Design principles so that the new system is usable by both sighted and visually impaired people.

RELATED WORK

There is a multidisciplinary effort, and a great amount of research, aimed at understanding and supporting the orientation and mobility of blind people in areas such as human-computer interaction, accessible computing, cognitive science, computer vision, and ubiquitous computing [1]. These efforts have produced a variety of solutions that help blind people navigate the real world and acquire spatial knowledge.

Solutions that support blind travelers include turn-by-turn navigation to help users reach a destination. Several researchers have tackled this challenge with different approaches and from different perspectives, including efforts to assess the requirements and information needs for blind navigation [8, 9]; the design and evaluation of interfaces to guide the user; the study and modeling of user behavior during navigation assistance [10, 11]; and the investigation of factors that influence user acceptance of such systems [1]. Still, most solutions present static interfaces that cannot adapt to users or to different situations.

Most commercial and research wayfinding tools that provide nonvisual feedback use speech, but there has been some work on haptic feedback for both blind and sighted users [6]. For instance, the Strider (1994), the Atlas (1995), the GPS-Talk (2000), and the BrailleNote (2001), all Sendero Group products, provided verbal instructions for traveling to desired destinations [9].

Pielot and Boll [8] and Heuten et al. [6] used a tactile belt to convey directional feedback to users with visual impairments. Amemiya and Sugiyama [1] developed new vibrotactile feedback modalities to convey directions. Kostopoulos et al. introduced a framework for analyzing map images and presenting the resulting semantic information to blind users through alternative modalities (i.e., haptics and audio); it uses novel algorithms that segment the map images with morphological filters and convert the visual information of street names into audio messages [2].
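As a rough illustration of such a pipeline (not Kostopoulos's actual implementation), morphological segmentation followed by OCR and text-to-speech might look like the following Python sketch; the OpenCV, pytesseract, and pyttsx3 tooling, the kernel size, and the function name are all assumptions:

```python
import cv2
import pytesseract  # OCR; assumed available
import pyttsx3      # offline text-to-speech; assumed available

def street_names_to_speech(map_path: str) -> None:
    """Segment a map image with morphological filters, OCR the
    remaining text-sized blobs, and speak the street names aloud."""
    img = cv2.imread(map_path, cv2.IMREAD_GRAYSCALE)

    # Binarize (dark ink on light paper), then apply a morphological
    # opening to suppress thin road strokes so text blobs dominate.
    _, binary = cv2.threshold(img, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    text_layer = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    # OCR expects dark text on a light background, so invert back.
    names = pytesseract.image_to_string(255 - text_layer)

    engine = pyttsx3.init()
    for line in filter(None, map(str.strip, names.splitlines())):
        engine.say(line)   # queue one street name per line
    engine.runAndWait()    # speak the queued messages
```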

Table 1 below lists navigation apps and systems for visually impaired people; the data were gathered from both commercial products and research case studies.

Table 1. A list of navigation apps and systems for visually impaired people

App/System | Mechanism(s)
1. VoiceOver (iPhone) | Speech
2. TalkBack (Android) | Speech
3. Ariadne GPS (iOS devices) | VoiceOver speech and vibration
4. Ariadne Headband | Haptic navigation cues
5. BlindSquare (iPhone) | Self-voicing with a filterable announcements screen; text entry through MBraille into any text field
6. Be My Eyes (Android/iPhone) | Live video connection that lets blind users call a sighted person for assistance
7. BuzzClip | Small wearable device that signals through vibrations
8. GetThere (Android) | Automatically spoken navigational guidance; answers queries about the current location and the distance to the nearest intersection; notification alarm
9. Nearby Explorer (Android/iPhone) | VoiceOver; text-based information about the surroundings, with options to customize what is heard while traveling
10. Sendero GPS LookAround (iPhone) | VoiceOver
11. Lazarillo (Android/iPhone) | Voice notifications and warnings
12. Tactors on wrists | Haptics
13. GPS & tactile foot feedback | Directions encoded as vibrations and conveyed via a tactile display inserted into the shoe
14. Framework for map image analysis & semantic information | Novel map-image segmentation algorithms; presentation through alternative modalities (haptics & audio)
15. TouchOverMap | Map exploration through vibro-tactile and speech feedback
16. Intuitive haptic interface | Ultrahaptics ultrasound technology that creates tactile sensations without touching physical objects
17. Sounds of Vision | Conveys a sense of vision through acoustics & haptics
18. Pocket Navigator | Simple map-based navigation using a set of Tactons
19. Smartphone Haptic Feedback | Turn-by-turn walking instructions through smartphone vibration
20. Non-visual support system for wayfinding | Belt with vibrators that indicates directions
21. ALVU (Array of Lidars & Vibrotactile Units) | Enables the user to distinguish free space from obstacles
22. WayBand | Turn-by-turn directions using a haptic language
23. Sunu Smartband | Radar and augmented reality to help blind users travel with confidence

DESIGN PROCESS

Design Goals

  1. Design a non-visual interface for route planning for people with vision loss.
  2. Incorporate Universal Design guidelines into the design.
  3. Use different modalities to meet all users' needs and create an optimal experience for all users, regardless of ability.

Design Criteria

The system, whether a mobile interface, a dedicated device, or a combination of both, should be usable by both sighted and visually impaired users.

A blind user should be able to use the system without feeling segregated or stigmatized because of differences in personal capabilities. The design should meet the following criteria:

  1. The design output should assist the user with orientation and with turn-by-turn wayfinding.
  2. The design output may use any modality: voice, haptics, or a combination of methods.
  3. The design communicates perceptible information to its target users (both sighted and visually impaired people); the product can be used without sight.
  4. The design should be simple and intuitive to use (low cognitive load for blind users).
  5. The design should be flexible in use: every potential user, sighted or visually impaired, should find at least one way to use the product effectively. The product should also facilitate user accuracy and precision.
  6. Tolerance for error: if the user makes a mistake, it should not cause damage. The design should prompt the user to pay attention during critical tasks and draw the user's attention to errors or hazards.

Design Concepts

Through investigation of the related research and available case studies, two concepts were developed.

The first concept is demonstrated through hand sketching. The design consists of a mobile application and a complementary wearable watch, which together provide voice output and haptics on the wrist to guide the user during navigation, based on Universal Design principles.

Figure 1. Concept 1: Mobile app with a complementary wearable watch. Output modalities: voice, plus haptics on the wrist (Universal Design).

The second concept uses only one device: a mobile application with an additional vibration motor on the phone. The mobile app provides voice output, while the vibration motor produces distinctive, recognizable haptic patterns that support independent navigation, again based on Universal Design principles; a sketch of such a haptic vocabulary follows Figure 2.

Figure 2. Concept 2: Mobile app using an additional vibration motor on the phone. Output modalities: voice, plus distinctive, recognizable haptics on the phone (Universal Design).
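A minimal sketch of how Concept 2's distinctive haptic vocabulary could be encoded; the specific patterns, their timings, and the `vibrate` callback are illustrative assumptions, not part of the concept itself:

```python
# Hypothetical haptic vocabulary for Concept 2: each turn-by-turn
# instruction maps to a distinctive on/off vibration pattern (ms).
HAPTIC_PATTERNS = {
    "turn_left":   [400, 200, 400],            # two long pulses
    "turn_right":  [100, 100, 100, 100, 100],  # three short pulses
    "go_straight": [800],                      # one sustained pulse
    "destination": [100, 100, 100, 100, 800],  # short-short-long
}

def play_instruction(instruction: str, vibrate) -> None:
    """Send the pattern for `instruction` to a platform-specific
    vibrate(on_off_pattern_ms) callback (assumed to exist)."""
    pattern = HAPTIC_PATTERNS.get(instruction)
    if pattern is None:
        raise ValueError(f"No haptic pattern for {instruction!r}")
    vibrate(pattern)
```

Keeping the patterns structurally distinct (long vs. short pulses, different counts) is what makes them recognizable without vision.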

DISCUSSION

In the design of a navigation system for visually impaired people, modalities other than vision must be used. Spoken instructions can also be highly intrusive and may not be heard in noisy environments. Therefore, a combination of haptics and voice can help blind users find their way effectively.

To meet the needs of blind users, the system outputs combine different modalities, resulting in a haptic-audio representation of the map that helps visually impaired people navigate safely. Voice informs the user about the orientation information of the environment, while haptics convey turn-by-turn wayfinding instructions.
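This division of labor, voice for orientation and haptics for turns, can be sketched as a simple dispatcher; the event structure and the `speak` and `vibrate_pattern` backends are hypothetical names for illustration:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class NavigationEvent:
    kind: str     # "orientation" or "turn"
    payload: str  # e.g. "Post office on your right" or "turn_left"

def dispatch(events: List[NavigationEvent],
             speak: Callable[[str], None],
             vibrate_pattern: Callable[[str], None]) -> None:
    """Route orientation information to the voice channel and
    turn-by-turn instructions to the haptic channel, so speech
    never competes with safety-critical maneuver cues."""
    for event in events:
        if event.kind == "orientation":
            speak(event.payload)
        elif event.kind == "turn":
            vibrate_pattern(event.payload)
```

Separating the two channels means a long spoken announcement never delays or masks a maneuver cue.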

CONCLUSION

This research proposes a non-visual interface for route planning for people with vision loss. In the first phase of the research, the related literature and commercial case studies were thoroughly investigated. Then, based on the research findings and users' needs, a navigation system for visually impaired people was developed. To incorporate Universal Design principles, the design uses only a mobile smartphone as the main device to provide blind and visually impaired people with orientation information and turn-by-turn walking instructions. Through the proposed audio-haptic framework, the combination of modalities can assist blind users in navigating independently: orientation information is received through voice, while haptic outputs provide turn-by-turn navigation. Ultrasonic sensors can also be used to detect obstacles in front of the visually impaired person.
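For the obstacle-detection step, ultrasonic ranging typically converts the echo's round-trip time into distance. The sketch below shows the standard calculation; the alert threshold and function names are our assumptions, as the paper does not specify a sensor:

```python
SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 degrees C

def echo_to_distance_m(echo_round_trip_s: float) -> float:
    """Convert an ultrasonic echo's round-trip time to the one-way
    distance to the obstacle (the pulse travels there and back)."""
    return echo_round_trip_s * SPEED_OF_SOUND_M_S / 2

def obstacle_ahead(echo_round_trip_s: float,
                   alert_threshold_m: float = 1.5) -> bool:
    """True if the detected obstacle is within the alert range."""
    return echo_to_distance_m(echo_round_trip_s) < alert_threshold_m
```

For example, echo_to_distance_m(0.0058) is roughly 1.0 m, which would fall inside the default alert range.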

In the next steps of the research, the design will be developed further and tested with visually impaired users.

REFERENCES

  1. Guerreiro J., et al. "Hacking Blind Navigation." In Proceedings of the CHI Conference on Human Factors in Computing Systems, 2019.
  2. Kammoun S., Jouffrais C. "Guiding Blind People with Haptic Feedback." In Pervasive Workshop on Frontiers in Accessibility for Pervasive Computing, January 2012.
  3. Loomis J., Golledge R., Klatzky R., Speigle J., Tietz J. "Personal Guidance System for the Visually Impaired." In Proceedings of the Annual ACM Conference on Assistive Technologies, October 1994.
  4. Velázquez R., et al. "An Outdoor Navigation System for Blind Pedestrians Using GPS and Tactile-Foot Feedback." Applied Sciences, April 2018.
  5. Vlaminck M., Hiep Q., Hoang V., et al. "Indoor Assistance for Visually Impaired People Using an RGB-D Camera." In Proceedings of the IEEE Southwest Symposium on Image Analysis and Interpretation, March 2016, 161-164.
  6. Azenkot S., et al. "Smartphone Haptic Feedback for Nonvisual Wayfinding." In Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, October 2011.
  7. Heuten W., et al. "Tactile Wayfinder: A Non-Visual Support System for Wayfinding." In Proceedings of the 5th Nordic Conference on Human-Computer Interaction (NordiCHI), 2008.
  8. Banovic N., Franz R., Truong K., Mankoff J., et al. "Uncovering Information Needs for Independent Spatial Learning for Users Who Are Visually Impaired." In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, October 2013.
  9. Nicolau H., Jorge J., Guerreiro T. "Blobby: How to Guide a Blind Person." In Proceedings of the 27th International Conference on Human Factors in Computing Systems, January 2009.
  10. Sato D., Oh U., Naito K., Takagi H., et al. "NavCog3: An Evaluation of a Smartphone-Based Blind Indoor Navigation Assistant with Semantic Features in a Large-Scale Environment." In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility, 270-279, November 2017.
  11. Stein T., Seeger M., et al. "Design Recommendations for Tactons in Touchscreen Interaction." IADIS International Journal, Vol. 15, No. 2, 2018.