A major problem for visually impaired college students is independent campus navigation. Many universities, such as Utah State University (USU), have no Orientation and Mobility (O&M) instructors. Thus, visually impaired undergraduates must rely on their friends, siblings, and even parents to learn their way around a large campus, which reduces their sense of independence. This paper describes a wearable two-sensor O&M device for visually impaired USU undergraduates and presents a single-subject feasibility test that evaluates whether a visually impaired navigator can use the device to learn new routes on the USU campus.
Visual impairment; blindness; assisted navigation; outdoor navigation; GPS; digital compass
Each year USU accepts a few visually impaired students [3]. The most difficult period for these individuals is the first semester, when they do not know the campus and have to rely on sighted guides, e.g., siblings, friends, classmates, and even parents, to find their way around. The problem is recurrent in that a blind student must learn new routes every semester when the student takes a class that meets in a building he or she has never visited before. It is hoped that the proposed device will be available to new visually impaired students through the Disabled Students Resource Center. The students will check out such a wayfinding device upon arrival, use it until they become comfortable with the campus, and return it to the Center. It is important to note that, unlike most paradigms in assisted navigation, this paradigm does not attempt to create a device dependency. Instead, the objective is to create a device that is used only temporarily, until its user achieves the required level of navigation independence in a given environment.
It is hypothesized by the investigators that a wearable system consisting of a small computational unit, a GPS receiver, a digital compass, a headphone, and a text-to-speech engine can enable a visually impaired navigator to learn new routes independently. This research is inspired by the findings of the research group at the University of California at Santa Barbara headed by Loomis and Klatzky that has been doing basic and applied research on the Personal Guidance System (PGS), a navigation system for the visually impaired [1]. The main differences between the research presented in this paper and the PGS research are: 1) a novel GPS-based localization method, 2) addition of a digital compass to the suite of sensors, 3) exclusion of 3D audio for information delivery, and 4) focus on independent route learning.
The current prototype, called WayFinder (see Figure 1), is worn as a vest. The system consists of a GPS unit on one shoulder and a digital compass on the other shoulder. A computational unit sits in the front on the user's chest with an attached numeric keypad, which allows the user to enter commands and respond to prompts from the system. A PCMCIA wireless card can be inserted on the bottom of the computational unit, although it is not currently used in outdoor environments. The system has headphones through which the user hears the system's commands and prompts. Figure 2 shows the hardware components of the WayFinder platform in an acrylic container attached to the vest. Figure 3 shows the hardware architecture.
Figure 2. Map of Routes
In outdoor environments, the system relies on a modified use of GPS for localization in order to increase accuracy [2]. GPS data can also be used to infer directionality, but such inference is unreliable because of signal drift: due to errors in the GPS data, the reported latitude and longitude of a user can appear to change even when the user is standing still. A digital compass overcomes this by providing reliable direction information. Currently, the compass is used to orient the user in the proper direction before the user begins a route; then, as the user moves along the route, it periodically reports the current direction to the user.
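The division of labor between the two sensors can be sketched as follows. This is an illustrative reconstruction, not the WayFinder source code; the function names and the 0.5 m/s speed threshold are assumptions:

```python
import math

def gps_bearing(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees clockwise from north) between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def heading(compass_deg, speed_mps, last_gps_bearing_deg, min_speed=0.5):
    """Trust the compass when the user is (nearly) still: below walking
    speed, consecutive GPS fixes differ mostly by drift, so a bearing
    computed from them is meaningless."""
    if speed_mps < min_speed:
        return compass_deg
    return last_gps_bearing_deg
```

The same drift argument explains why the compass alone is used for the initial orientation step: before the user starts walking, there is no reliable GPS-derived bearing at all.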
When the user desires to go to a new destination, the user first enters the destination into the system through a small wearable keypad. When the user is ready to start, the system orients the user with the compass so that the user starts walking in the correct direction. As the user moves along the desired route, the system periodically informs the user of his or her current direction or tells the user what action needs to be taken: continue walking forward, turn left, or turn right. When the user reaches the desired destination, the system gives final instructions, such as how far away the door to the building is. The only time the user actually has to enter information into the system is at the beginning of a walk. The desired destination is chosen by navigating a voice-based directory of available destinations.
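The per-step decision the system makes while the user walks can be illustrated with a short sketch. This is a hypothetical reconstruction of the guidance logic, not the actual WayFinder code; the 20-degree tolerance is an assumed value:

```python
def turn_instruction(heading_deg, bearing_to_waypoint_deg, tolerance_deg=20.0):
    """Compare the compass heading with the bearing to the next waypoint
    and map the signed error to one of the three spoken instructions."""
    # Wrap the error into the -180..180 range, so that headings of 350
    # and 10 degrees read as a 20-degree difference, not 340.
    error = (bearing_to_waypoint_deg - heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= tolerance_deg:
        return "continue walking forward"
    return "turn right" if error > 0 else "turn left"

print(turn_instruction(0.0, 5.0))    # continue walking forward
print(turn_instruction(0.0, 90.0))   # turn right
print(turn_instruction(90.0, 0.0))   # turn left
```

The chosen instruction string would then be handed to the text-to-speech engine and played through the headphones.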
In order to see if the device can aid a person in learning new routes, four routes (see Figure 2) were chosen on the Utah State University campus. The routes were the paths a user would take to get from building to building. When combined, the four routes form a loop with the user ending up at the start position of route 1 when he finished route 4. The routes all kept the user on the sidewalk.
The test subject was a visually impaired USU graduate student who is a guide dog handler. Over each route the system correctly instructed the user when and in which direction to turn, and periodically reported the compass direction to the user. Upon arrival it gave final instructions on how to enter the building. After completing the loop of test routes once using the WayFinder, which took approximately 15 minutes, the user was asked to repeat the loop without the device.
The subject successfully completed all four routes. After the test was completed, the test subject stated that route 3 was the most difficult, because it contained a long straight path. He noted that one of the buildings he passed had a furnace that he could hear during the walk with the WayFinder. On the walk without the WayFinder, he used that furnace noise to gauge how long he had to walk before turning onto the sidewalk for the Ag Science destination. Although the test subject did not know the routes in advance, he was familiar with the area. A true test of the system will require test subjects who are totally unfamiliar with both the routes and the area. While the user liked the functionality of the system and understood that it was just a test bed, he suggested that the system must be smaller and less conspicuous.
The study was funded, in part, by two Community University Research Initiative (CURI) grants from the State of Utah (2003-04 and 2004-05) and NSF Grant IIS-0346880. The authors would like to thank Mr. Sachin Pavithran, a visually impaired training and development specialist at the USU Center for Persons with Disabilities, for his feedback on the localization experiments.