Supervisor: Prof. Gudrun Klinker, Ph.D.
Advisor: Daniel Dyrda, M.Sc.
People with visual impairment or blindness face difficulties with day-to-day tasks, such as navigating from one point to another or familiarizing themselves with an unknown environment. The aim of this bachelor's thesis is to create an Augmented Reality system that supports people affected by these disabilities by giving acoustic feedback generated through different wayfinding support techniques.
To support the wayfinding process, a system consisting of a depth camera and a hand tracking device has been developed. The system offers three modes, each designed to support a different aspect of wayfinding. The first two modes are user-centered and directly convert visual stimuli into acoustic stimuli to help users form a better mental image of their immediate surroundings. The third mode is a pathfinding mode that guides users along a safe path to their goal. In all of these modes, sounds are used to convey information about distance and direction.
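One plausible way to realize such a distance-and-direction sonification is sketched below. The mapping is an assumption for illustration, not the thesis's actual implementation: volume falls off linearly with distance, and the horizontal angle to a point is mapped to constant-power stereo panning. All names and parameters (`sonify`, `max_distance_m`, the ±90° angle range) are hypothetical.

```python
import math

def sonify(distance_m, angle_deg, max_distance_m=5.0):
    """Map a point's distance and direction to stereo gains.

    Hypothetical mapping: volume decreases linearly with distance
    (clamped to [0, 1]); angle_deg is the horizontal angle to the
    point (-90 = far left, 0 = straight ahead, +90 = far right).
    """
    # Distance -> volume: nearer obstacles sound louder.
    volume = max(0.0, 1.0 - distance_m / max_distance_m)

    # Direction -> pan: constant-power panning keeps the perceived
    # overall loudness stable while shifting the sound left or right.
    pan = max(-1.0, min(1.0, angle_deg / 90.0))  # normalize to [-1, 1]
    theta = (pan + 1.0) * math.pi / 4.0          # 0 .. pi/2
    left_gain = volume * math.cos(theta)
    right_gain = volume * math.sin(theta)
    return left_gain, right_gain
```

For example, an obstacle straight ahead yields equal left and right gains, while one at the maximum distance is silent. A scheme like this also illustrates the limitation noted in the pre-study results: if volume is the sole distance cue, nearby quiet sources and distant loud ones can be confused, which motivates combining volume with additional cues such as pitch or pulse rate.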
To answer the research question, a pre-study consisting of various tests was conducted. The results indicate that sound can indeed be used to support the wayfinding process, but a proper user study is needed to obtain more definitive answers.
The results of the pre-study strongly suggest that sounds can be used to signal both distance and direction. While the study population already had a strong intuitive grasp of direction, distances proved more challenging, since sound volume served as the sole indicator of distance.