Johan de Langen
Visually impaired people perceive the world radically differently from sighted people. In most cases they are not completely blind, but cannot make sense of what their eyes perceive. Humans are highly visual beings and receive a large share of information this way. Going outside and navigating traffic to get somewhere relies mostly on visual feedback; think of how you get to your workplace or school. Vision gives us the ability to foresee and react to disturbances in the environment. There are indeed tools to help. A seeing eye dog's job is to perceive the environment for its owner and react to it, and a white cane gives limited feedback about objects directly in front of you. However, the process of getting a seeing eye dog can take more than a year and the training is expensive, while smartphone apps are helpful but still require some residual vision.
What we envision is a wearable that combines the advantages of direct haptic feedback with perceiving the environment at a wider range. The wearable uses 3D camera technology, similar to that used in autonomous vehicles, to detect potential obstacles and dangers, and warns the wearer by vibrating on the side facing the oncoming obstacle. In this way the aWeare intelligently guides the wearer to his or her intended destination: there is no need to constantly scan the ground with a cane or to rely solely on a guide dog. Furthermore, the aWeare can cooperate with existing aids, creating potential for emergent phenomena.
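The mapping from a 3D camera frame to directional vibration can be sketched roughly as follows. This is a hypothetical illustration, not the aWeare implementation: the function `obstacle_vibration`, the left/centre/right sector split, and the 3 m range threshold are all assumptions made for the example.

```python
import numpy as np

def obstacle_vibration(depth_m: np.ndarray, max_range_m: float = 3.0) -> dict:
    """Map a depth frame (metres) to per-sector vibration intensities in [0, 1].

    Sketch: split the frame into left/centre/right sectors, take the nearest
    point in each sector, and vibrate harder the closer that point is.
    """
    h, w = depth_m.shape
    sectors = {
        "left": depth_m[:, : w // 3],
        "centre": depth_m[:, w // 3 : 2 * w // 3],
        "right": depth_m[:, 2 * w // 3 :],
    }
    intensities = {}
    for name, sector in sectors.items():
        nearest = float(np.min(sector))
        if nearest >= max_range_m:
            intensities[name] = 0.0  # nothing within warning range
        else:
            # Linear ramp: 1.0 when touching, 0.0 at the range limit.
            intensities[name] = 1.0 - nearest / max_range_m
    return intensities

# Simulated 4x6 depth frame: open space (5 m) with an obstacle 0.5 m
# away on the wearer's right, which should trigger the right-side motor.
frame = np.full((4, 6), 5.0)
frame[:, 4:] = 0.5
print(obstacle_vibration(frame))
```

A real device would run this per frame and drive the vibration motors from the resulting intensities, possibly smoothing over time to avoid jitter.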
This project is being coached by