SAFA ANDAÇ

Assessment of functional vision for daily life locomotion and navigation using VR

Updated Nov. 2024


This project, supervised by Prof. M.B. Hoffmann, is based in the Visual Processing Lab at OVGU. It aims to develop tools for determining the effect of visual impairment on daily-life locomotion, and specifically navigation, using VR paradigms.


Personal Background:

  • BS in Computer Engineering - MA in Cognitive Science
  • Data Structures and Algorithms
  • Advanced proficiency in C, C++17, C#, Python, Unity, and MATLAB
  • General knowledge of other programming languages
  • Design of Experimental Paradigms


Personal Interest(s):

A real Vimmer, a chess player, and a competitive programmer

 

Aim of the project:

The objectives of my project are to:

  1. determine the effect of visual impairment on functional vision during daily locomotion, specifically on gait control, body movements, and navigation;
  2. use visually guided locomotion abilities to determine functional vision;
  3. establish links between locomotion abilities and functional vision capacities and relate these to quality of life (QoL);
  4. understand the gaze behavior of visually impaired people during navigation in an unfamiliar environment using VR.


Current activities:

I have been investigating how people with visual impairments move their eyes while exploring a new place in virtual reality (VR). In collaboration with Peter König and his lab, I used eye tracking to measure where people look and how they explore the space.
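
A toy illustration of the kind of analysis this involves is a dispersion-based (I-DT) fixation detector. The sketch below is hypothetical, assuming gaze samples as visual angles at a fixed sampling rate; it is not the actual analysis pipeline of the study, and the function name and thresholds are my own.

```python
# Minimal dispersion-based (I-DT) fixation detection sketch.
# Assumes gaze as an (N, 2) array of (x, y) positions in degrees of
# visual angle, sampled at a fixed rate. Thresholds are illustrative.
import numpy as np

def detect_fixations(gaze, fs=120, max_dispersion=1.0, min_duration=0.1):
    """Return (start, end) sample-index pairs of detected fixations."""
    min_samples = int(min_duration * fs)
    fixations = []
    i, n = 0, len(gaze)
    while i + min_samples <= n:
        j = i + min_samples
        # Dispersion = (max x - min x) + (max y - min y) over the window.
        if np.ptp(gaze[i:j, 0]) + np.ptp(gaze[i:j, 1]) <= max_dispersion:
            # Grow the window while dispersion stays below threshold.
            while j < n and (np.ptp(gaze[i:j + 1, 0]) +
                             np.ptp(gaze[i:j + 1, 1])) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations
```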


I also used VR to simulate scotomas, visual field defects that block part of a person's view, either in the center or at the sides. I wanted to see how these simulated defects affect how people look around, especially in the absence of adaptation.
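
To give an idea of the principle, here is a minimal sketch of a gaze-contingent central scotoma applied to a single rendered frame. The function and parameters are my own illustration; the actual VR implementation (e.g., a shader in Unity) works differently.

```python
# Sketch: gray out a disc around the current gaze position to simulate a
# central scotoma. Illustrative only, not the study's implementation.
import numpy as np

def apply_central_scotoma(frame, gaze_xy, radius_px):
    """Mask a disc of radius_px pixels around gaze_xy = (col, row)."""
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    dist2 = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2
    masked = frame.copy()
    masked[dist2 <= radius_px ** 2] = 128  # mid-gray occluder
    return masked

# A peripheral scotoma would invert the condition and mask everything
# with dist2 > radius_px ** 2 instead.
```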


Furthermore, in collaboration with Yaxin (ESR4 from Pattern Recognition Company), we applied deep learning methods to investigate differences and similarities between glaucoma patients and controls, using video clips of age-matched control and glaucoma groups walking on a treadmill. Yaxin and I are finalizing the manuscript for submission.
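
For intuition only, the sketch below shows one generic way to classify short gait clips with deep learning: a tiny per-frame CNN whose features are averaged over time before a linear classification head. This architecture is purely illustrative and is not the model from our manuscript.

```python
# Toy two-class video classifier (e.g., glaucoma vs. control) in PyTorch.
# Per-frame CNN features are averaged over time; illustrative only.
import torch
import torch.nn as nn

class GaitClipClassifier(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.backbone = nn.Sequential(  # tiny per-frame feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, clip):  # clip: (batch, frames, 3, H, W)
        b, t = clip.shape[:2]
        feats = self.backbone(clip.flatten(0, 1))  # (b*t, 32)
        feats = feats.view(b, t, -1).mean(dim=1)   # average over frames
        return self.head(feats)

logits = GaitClipClassifier()(torch.randn(2, 8, 3, 64, 64))  # shape (2, 2)
```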


Future directions:

As a future project, I plan to determine how the gaze behavior of visually impaired individuals relates to their quality of life using machine learning techniques. Next, I want to study how people with vision problems such as glaucoma explore their surroundings. I am especially interested in the different ways they might look around or adapt their behavior.
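
As a rough sketch of that direction, the example below relates hypothetical per-participant gaze features to a QoL score with cross-validated ridge regression in scikit-learn. Both the features and the data are placeholders, not results.

```python
# Placeholder example: predict a QoL score from gaze features such as
# fixation duration, saccade amplitude, and scan-path length (hypothetical).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))   # 40 participants x 3 gaze features (fake)
qol = rng.normal(size=40)      # e.g., questionnaire-based QoL score (fake)

model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
print(cross_val_score(model, X, qol, cv=5, scoring="r2").mean())
```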


My OptiVisT experience:

Being part of the OptiVisT project was one of the most valuable parts of my PhD. It helped me grow as a researcher, improve my communication and teamwork skills, and approach problems from different perspectives. Collaborating with people from various backgrounds gave me a broader view of my research and made the experience more meaningful. Most importantly, I had the chance to work with supportive and dedicated colleagues, which made the overall journey both productive and enjoyable.

Project output

I had the chance to present my research at two important conferences: the European Conference on Visual Perception (ECVP) in 2024 and the congress of the German Society of Ophthalmology (DOG) in 2023.



We published our study on path integration in virtual reality in Scientific Reports.

Contact

Interested in my work and want to get in touch? Send me an e-mail or follow me on GitHub.