Scientists from IBM Research and Carnegie Mellon University (CMU) have created an open platform to help developers build apps that enable blind people to better navigate their surroundings.
The open platform was used to create a pilot app, dubbed NavCog, which analyses signals from Bluetooth beacons situated along walkways and from smartphone sensors.
The app helps users move without human assistance, whether inside campus buildings or outdoors.
The app guides individuals by whispering into their ears through earbuds or by vibrating their smartphones.
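The article does not disclose how NavCog turns beacon signals into a position, so the following is an illustrative sketch only: it uses a standard log-distance path-loss model to estimate how far away each Bluetooth beacon is from its received signal strength (RSSI), then picks the nearest one. All beacon identifiers, RSSI values, and calibration constants here are made up for the example.

```python
import math

# Illustrative sketch, not NavCog's actual algorithm. Estimates distance
# to BLE beacons from RSSI using a log-distance path-loss model, then
# selects the nearest beacon. All values below are hypothetical.

def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance in metres from received signal strength.

    tx_power_dbm: calibrated RSSI at 1 metre (an assumed value here).
    path_loss_exponent: ~2.0 in free space; higher indoors.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def nearest_beacon(readings):
    """Pick the beacon with the smallest estimated distance."""
    return min(readings, key=lambda r: estimate_distance(r["rssi"]))

# Hypothetical scan results from two beacons along a walkway.
readings = [
    {"id": "beacon-hallway-3", "rssi": -72},
    {"id": "beacon-entrance-1", "rssi": -60},
]
print(nearest_beacon(readings)["id"])  # the stronger (closer) beacon wins
```

In practice an app would smooth RSSI over several scans before estimating distance, since individual readings fluctuate heavily indoors.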
Researchers are exploring additional capabilities for future versions of the app, such as identifying who is approaching the user and gauging that person's mood.
IBM has made the first set of cognitive assistance tools available on the cloud through its Bluemix platform; the open toolkit comprises a navigation app, a map-editing tool, and localisation algorithms.
Researchers plan to add several localisation technologies, including sensor fusion, which integrates data from multiple environmental sensors to support more sophisticated cognitive functions such as facial recognition in public places.
Researchers are also exploring computer vision to characterise the activities of people in the vicinity, and ultrasonic technology to detect locations more accurately.