Open Science
Below are links to the code and data that are currently publicly available. This page will be updated as additional resources become available. In addition, the OptiVisT researchers have indicated their willingness to share further code and data upon reasonable request.
Code
All original Python code used to visualise and analyse the data of Project 3 has been deposited at the Open Science Framework (OSF) and is publicly available:
- Analysing gaze behaviour data during stair climbing in an uncontrolled and familiar setting
- Analysing gaze behaviour data during stair climbing when carrying a tray
- Analysing gaze behaviour data during stair climbing with varying task instructions and expectations
- Analysing gaze behaviour data of blind people during stair climbing with a cane
Code associated with Project 4 is publicly available on GitHub: https://github.com/kaka761
Code associated with Project 5 is publicly available on GitHub: https://github.com/LEO-UMCG/Non-overlapping-visual-field-defect-psychophysical-experiments
Code associated with Project 10 is publicly available on GitHub:
Code associated with Project 12 is publicly available on GitHub:
- https://github.com/WardNieboer/Tennis-gaze-analysis
- https://github.com/WardNieboer/tennis-ball-click-tool
- https://github.com/WardNieboer/Eye-tracking_test-battery
Code associated with Project 13 is publicly available on GitHub:
- https://github.com/arnejad/E2E-Point-SPV
- https://github.com/arnejad/ACE-DNV
- https://github.com/LEO-UMCG/SPV-Gaze-Contingency
- https://github.com/LEO-UMCG/Non-overlapping-visual-field-defect-psychophysical-experiments
Data
Analysed data of Project 3 have been deposited at the Open Science Framework (OSF) and are publicly available:
- Gaze behaviour data during stair climbing in an uncontrolled and familiar setting
- Gaze behaviour data during stair climbing when carrying a tray
- Gaze behaviour data during stair climbing with varying task instructions and expectations
- Gaze behaviour data of blind people during stair climbing with a cane
The datasets used by Project 4 are publicly available from external resources:
- https://www.crcv.ucf.edu/research/data-sets/ucf101/
- https://serre-lab.clps.brown.edu/resource/hmdb-a-large-human-motion-database/
- https://www.kaggle.com/datasets/dcsyanwq/weizmann-dataset
- https://www.csc.kth.se/cvap/actions/
- https://labicvl.github.io/ges_db.htm
- https://gibranbenitez.github.io/IPN_Hand/
Data collected for Project 6's game design "V-Spy Scotoma" have been shared on Zenodo: https://zenodo.org/records/10829367
Data associated with Project 10 have been deposited at OSF and are publicly available:
- Helping Blind People Grasp: Evaluating a Tactile Bracelet for Remotely Guiding Grasping Movements
- Guiding the hand to an invisible target
- Helping Blind People Grasp: Enhancing a Tactile Bracelet with an Automated Hand Navigation System
Eye-movement data of glaucoma patients with asymmetrical visual field loss during free viewing, associated with Project 11, have been shared on Zenodo: https://zenodo.org/records/7761477
Other
ESR7 shared all his presentation materials on Slides: https://slides.com/safaandac
ESR13 shared papers, posters, and presentation slides on his personal webpage: www.nejad.info