Intel RealSense Spatial Awareness Wearable (SAW)
The Spatial Awareness Wearable (SAW) is a collection of prototypes that use Intel RealSense depth cameras to augment spatial awareness for people with visual impairments. The system is a network of technologies worn on the user's body that pairs depth sensing with computer vision techniques and haptic feedback.
We began with a very constrained and focused target: help users with severe visual impairment navigate a cluttered space. To realize this, SAW used an ultrabook in a backpack wired via an Arduino to eight vibration motors, all tentatively held in place with straps and electrical tape. This was refined into a backpack-contained unit with hardwired, 3D-printed feedback boxes attached to an adapted undergarment for Intel CEO Brian Krzanich's 2015 CES keynote.
The prototype then passed through a phase in which the same laptop-in-a-backpack approach communicated over WiFi (via a battery-powered WiFi router) with Particle Photons. The most recent iteration (as of this writing) employs Intel's Joule for its core compute, a RealSense ZR300 or R200 for hardware depth sensing, librealsense and OpenCV for its computer vision algorithms, and eight Curie-based tinyTILEs that handle Bluetooth wireless communication with the core compute system and drive the vibration motors that provide haptic feedback to the wearer.
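To make the depth-to-haptics pipeline concrete, here is a minimal sketch of the kind of mapping such a system might perform: split each depth frame into one region per vibration motor and drive each motor harder the closer the nearest obstacle in its region is. The grid layout, motor count, distance range, and function name are illustrative assumptions, not the project's actual algorithm; a real implementation would pull frames from librealsense and send intensities to the tinyTILEs over Bluetooth.

```python
import numpy as np

def depth_to_haptics(depth_m, grid=(2, 4), near=0.5, far=3.0):
    """Map a depth frame (meters) to per-motor vibration intensities in [0, 1].

    The frame is split into a grid of regions -- one per vibration motor --
    and the closest valid reading in each region sets that motor's intensity
    (closer obstacle -> stronger vibration). All parameters here are
    illustrative assumptions, not values from the SAW project.
    """
    rows, cols = grid
    h, w = depth_m.shape
    intensities = []
    for r in range(rows):
        for c in range(cols):
            region = depth_m[r * h // rows:(r + 1) * h // rows,
                             c * w // cols:(c + 1) * w // cols]
            valid = region[region > 0]          # depth cameras report 0 for "no data"
            if valid.size == 0:
                intensities.append(0.0)         # nothing sensed -> motor off
                continue
            d = float(valid.min())              # nearest obstacle in this region
            # Linear ramp: full vibration at `near` or closer, off at `far` or beyond.
            intensities.append(float(np.clip((far - d) / (far - near), 0.0, 1.0)))
    return intensities

# Example: an obstacle 0.5 m away in the wearer's upper left, open space elsewhere.
frame = np.full((4, 8), 3.0)
frame[0:2, 0:2] = 0.5
print(depth_to_haptics(frame))  # first motor at full intensity, the rest off
```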
This project has been continuously informed by informal user testing, which revealed remarkable differences among users. Our testers' visual impairments ranged from users who had been blind for most or all of their lives to several users with retinitis pigmentosa (RP) who showed remarkable variance in visual acuity and adapted behavior. Each shaped the project with their input, insight, observed behavior, and suggestions. Even the project's conception evolved, from obstacle avoidance to spatial awareness. Intel had no plans to productize the project, and Brian gave us the mandate to share our code and designs publicly so that researchers and developers could use our project as a jumping-off point for their own work. The original team is no longer working together on the project, though we each follow it and contribute at times to assist with its ongoing evolution beyond its days in our lab.
Now for some visual history of the project:
- Technology Brings Spatial Awareness to People with Vision Loss
- Assistive Technology for Visually Impaired Uses Intel 3D Cameras
- Intel says 3D camera, designed for laptops and tablets, may also help the blind
- A Wearable That Can Help Blind People Understand Their Surrounding
- #MadeWithCinder: Intel RealSense at CES 2015
- Low vision device using Intel RealSense 3D camera technology