This project is supervised by Prof. M.B. Hoffmann at the Visual Processing Lab at OVGU (Otto von Guericke University Magdeburg). The project aims to identify tools for determining the effect of visual impairment on daily-life locomotion, and specifically on navigation, using VR paradigms.
Personal Background:
Personal Interest(s):
A true Vimmer, a chess player, and a competitive programmer
Aim of the project:
The objectives of my project are to 1) determine the effect of visual impairment on functional vision during daily locomotion, specifically on gait control, body movements and navigation, 2) use visually guided locomotion abilities to determine functional vision, and 3) establish links between locomotion abilities and functional vision capacities and relate these to quality of life (QoL).
Current activities:
To date, I have analyzed outcome measures such as travel time, pointing time, and distance error from locomotion data of 14 glaucoma patients and 15 controls, collected with VR equipment in a navigation paradigm implemented in a virtual environment.
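For illustration, here is a minimal Python sketch of how such outcome measures could be computed from a single trial's head-tracking trace. The data layout, variable names, and pointing-phase timestamps are hypothetical assumptions, not taken from the study's actual analysis pipeline.

```python
import numpy as np

# Hypothetical trial record: timestamps t (s) and head positions pos (x, z)
# in metres, as sampled by the VR headset. The real data format may differ.
def outcome_measures(t, pos, target, pointing_onset, pointing_end):
    """Compute the three outcome measures for one navigation trial."""
    travel_time = t[-1] - t[0]                         # start to goal (s)
    pointing_time = pointing_end - pointing_onset      # pointing phase (s)
    distance_error = np.linalg.norm(pos[-1] - target)  # final pos vs. target (m)
    return travel_time, pointing_time, distance_error

# Example with synthetic values only
t = np.linspace(0.0, 42.5, 200)
pos = np.column_stack([np.linspace(0, 5, 200), np.linspace(0, 3, 200)])
target = np.array([5.2, 2.9])
print(outcome_measures(t, pos, target, pointing_onset=40.0, pointing_end=41.3))
```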
Furthermore, I have started my cross-sector secondment at the Pattern Recognition Company. In collaboration with ESR4 (Yaxin), I am developing deep-learning-based methods to investigate differences and similarities between glaucoma patients and controls using the locomotion data from the first study.
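As a rough illustration of the kind of model this involves, the sketch below classifies glaucoma vs. control from fixed-length locomotion sequences with a small recurrent network in PyTorch. The architecture, feature set, and sequence length are assumptions for the example, not the method actually used in the collaboration.

```python
import torch
import torch.nn as nn

# Minimal sketch: sequence classifier for glaucoma vs. control, assuming
# fixed-length locomotion sequences of T time steps x F features (e.g.,
# walking speed, head yaw, step length). All choices here are illustrative.
class LocomotionClassifier(nn.Module):
    def __init__(self, n_features=6, hidden=32):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # two classes: glaucoma, control

    def forward(self, x):                  # x: (batch, T, n_features)
        _, h = self.encoder(x)             # h: (1, batch, hidden)
        return self.head(h.squeeze(0))     # logits: (batch, 2)

model = LocomotionClassifier()
dummy = torch.randn(8, 500, 6)             # batch of 8 synthetic trials
logits = model(dummy)
loss = nn.functional.cross_entropy(logits, torch.randint(0, 2, (8,)))
loss.backward()
```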
Future directions:
In a future project, I plan to use machine learning techniques to determine how locomotion abilities in visually impaired individuals relate to their quality of life.
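One plausible shape for such an analysis is sketched below, assuming per-participant locomotion features and a continuous QoL questionnaire score; all data, feature choices, and the choice of a cross-validated ridge regression are synthetic placeholders, not the planned method.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Illustrative only: relate per-participant locomotion features (e.g., mean
# travel time, mean distance error) to a QoL score with a cross-validated
# linear model. Data below are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(29, 3))          # 29 participants x 3 locomotion features
qol = X @ np.array([0.5, -0.3, 0.1]) + rng.normal(scale=0.2, size=29)

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, qol, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```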
Output:
No output yet.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 955590.
STARTING DATE: 01/03/2021
COMPLETION DATE: 28/02/2025