PROJECT 10

Augmenting functional vision using automated tactile guidance

Status of position: filled


ESR: Marcin Furtak

Why? Recent advances in computational object recognition, robotics and machine learning, combined with the miniaturization of hardware, offer many opportunities for digital personal assistance. If self-driving cars are around the corner, why not use similar camera-based approaches to assist humans in everyday tasks such as shopping and wayfinding? Especially for seniors and people with mild to severe impairments, technology can be the key to self-determined participation in society.


How? Today, we offer a tactile navigation belt for blind users that conveys orientation and navigation cues through vibration around the waist. Tomorrow, we want to give our customers a digital personal assistant that integrates environmental cues, captured by cameras and IR distance sensors, with machine learning to offer each user situationally adapted assistance in real time. For this, we will use cutting-edge approaches from deep-neural-network-based object recognition, AI and robotics, combined with tactile displays as the user interface. The digital assistant will be optimized for seamless integration and will support users according to their individual needs, covering everyday tasks such as shopping in the local supermarket.
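
To make the intended pipeline more concrete, below is a minimal, purely illustrative Python sketch of its final step: turning a single object detection (with a bearing estimated from the camera image and a distance from the IR sensor) into a vibration cue on a belt of motors. All names and parameters here (Detection, belt_command, the 16-motor layout, the 4 m sensor range) are assumptions for illustration only and are not part of the naviBelt API.

    from dataclasses import dataclass

    # Assumed hardware parameters, chosen only for this sketch.
    N_MOTORS = 16        # vibration motors spaced evenly around the waist
    MAX_RANGE_M = 4.0    # useful range assumed for the IR distance sensor

    @dataclass
    class Detection:
        label: str          # e.g. class label from a DNN object detector
        bearing_deg: float  # angle relative to the user's heading, 0 = straight ahead
        distance_m: float   # distance estimate from the IR sensor

    def belt_command(det: Detection) -> tuple[int, float]:
        """Map one detection to (motor index, vibration intensity in [0, 1])."""
        # The bearing, wrapped to 0..360 degrees, selects a motor on the ring.
        motor = round((det.bearing_deg % 360) / 360 * N_MOTORS) % N_MOTORS
        # Closer objects produce stronger vibration, clamped to [0, 1].
        intensity = max(0.0, min(1.0, 1.0 - det.distance_m / MAX_RANGE_M))
        return motor, intensity

    if __name__ == "__main__":
        item = Detection(label="milk carton", bearing_deg=30.0, distance_m=1.2)
        print(belt_command(item))  # (1, 0.7): motor slightly to the right, strong cue

In a full system, the detections would come from the object recognition model and the belt commands would be streamed continuously, so that the cue follows the target as the user moves.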


Where? The company feelSpace GmbH (Osnabrück, Germany) runs the project in cooperation with the University of Osnabrück. feelSpace GmbH is a high-tech start-up known for its sensory augmentation technology: it developed the wearable "naviBelt", the first tactile navigation device for visually impaired people. The project will be supervised by Silke Kärcher (CEO of feelSpace) and Prof. Dr. Peter König of the Cognitive Science department at the University of Osnabrück. The University of Osnabrück offers many exciting courses on all related topics, international researchers with a wide range of experience, and a lively lab atmosphere.


What can you expect to learn and experience? You can do your PhD in an environment that bridges the gap between theoretical and applied science. The combination of feelSpace GmbH and the University of Osnabrück offers the rare chance to gain valuable work experience in a highly innovative company, as part of a motivated and curious team, while enjoying all the benefits of the excellent Cognitive Science program at the University of Osnabrück.


Who are we looking for? We are looking for a candidate with deep knowledge of object recognition, machine learning and robotics, as well as hands-on experience in these areas. Strong programming skills (especially Java, Android, C and C++) are an essential requirement. We are looking for applicants with an independent, goal-oriented and self-motivated working style. We are also open to accommodating your family obligations with flexible working arrangements.


References

  • Kärcher SM, Fenzlaff S, Hartmann D, Nagel SK and König P (2012). Sensory augmentation for the blind. Front Hum Neurosci 6:37. DOI: 10.3389/fnhum.2012.00037
  • Gulde T, Kärcher S and Curio C (2016). Vision-based SLAM navigation for vibrotactile human-centred indoor guidance. ECCV 2016
  • Hernández-García A and König P (2018). Further advantages of data augmentation on convolutional neural networks. International Conference on Artificial Neural Networks (ICANN). Preprint: arXiv:1906.11052
  • Hernández-García A and König P (2019). Learning representational invariance instead of categorization. International Conference on Computer Vision (ICCV), Workshop on pre-registration in Computer Vision

You can apply for this position until September 10th, 2022, 6 pm (GMT+1).

Project output


No output yet.
