Join us on Wednesday, December 1 at noon Pacific for the Spatial AI and CV Hack Chat with Erik Kokalj!
A lot of what we take for granted these days existed only in the realm of science fiction not all that long ago. And perhaps nowhere is this more true than in the field of machine vision. The little bounding box that pops up around everyone’s face when you go to take a picture with your cell phone is a perfect example; it seems so trivial now, but just think about what’s involved in putting that little yellow box on the screen, and how implausible it would have been just 20 years ago.
Perhaps even more exciting than the development of computer vision systems is their accessibility to anyone, as well as their move into the third dimension. No longer confined to flat images, spatial AI and CV systems seek to extract information from the position of objects relative to others in the scene. It’s a huge leap forward in making machines see like we see and make decisions based on that information.
To help us along the road to incorporating spatial AI into our projects, Erik Kokalj will stop by the Hack Chat. Erik does technical documentation and support at Luxonis, a company working on the edge of spatial AI and computer vision. Join us as we explore the depths of spatial AI.
Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, December 1st at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.
For their final project in embedded microcontroller class, [Aaheli, Jun, and Naomi] turned their focus toward assistive technology and created an Electronic Travel Aid (ETA) for the visually impaired that uses haptic feedback to report the presence of obstacles.
We have seen a few of these types of devices in the past, and they almost always use ultrasonic sensors to gauge distance. Not so with this ETA; it uses six VL53L0X time-of-flight (ToF) sensors mounted at slightly different angles from each other, which provides a wide sensing map. It is capable of detecting objects in a one-meter-wide swath at a range of one meter from the sensors.
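The geometry behind that coverage claim is easy to check. Here's a quick back-of-the-envelope sketch (the even angular fan is our assumption; the build may space the sensors differently) showing the total fan angle needed to sweep a one-meter swath at one meter, and the tilt step between adjacent sensors:

```python
import math

SWATH_M = 1.0   # desired coverage width at the target range
RANGE_M = 1.0   # distance at which that width is achieved
SENSORS = 6

# Half-angle of the fan needed to cover the swath at that range
half_angle = math.degrees(math.atan((SWATH_M / 2) / RANGE_M))
total_span = 2 * half_angle            # ~53 degrees edge to edge
pitch = total_span / (SENSORS - 1)     # tilt step between adjacent sensors

print(f"total fan: {total_span:.1f} deg, per-sensor step: {pitch:.1f} deg")
```

That works out to roughly a 53° fan, or about 10–11° between neighboring sensors. Since each VL53L0X has a field of view of around 25°, adjacent beams overlap comfortably, which is what fills in the map without gaps.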
The device consists of two parts, a wayfinding wand and a feedback module. The six ToF sensors are strapped across the end of a flashlight body and wired to an Arduino Mini inside the body. The Mini collects the sensor readings and sends them over UART to a PIC32, which is attached to a sleeve on the user’s forearm. The PIC decodes the UART data into PWM and drives six corresponding vibrating disc motors that dangle from the sleeve and form a sensory cuff bracelet around the upper forearm.
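The distance-to-PWM mapping is the heart of the feedback module. The project's actual curve and thresholds aren't published here, so this is a minimal sketch assuming a linear ramp and a made-up 100 mm saturation point, with anything past the one-meter range leaving the motor off:

```python
MAX_RANGE_MM = 1000   # article: obstacles detected out to about one meter
MIN_RANGE_MM = 100    # assumed near limit where vibration saturates

def distance_to_duty(distance_mm: int) -> int:
    """Map one ToF reading to an 8-bit PWM duty for its vibration motor.

    Closer obstacle -> stronger vibration; nothing in range -> motor off.
    """
    if distance_mm >= MAX_RANGE_MM:
        return 0
    d = max(distance_mm, MIN_RANGE_MM)
    # Linear ramp: MIN_RANGE_MM -> 255 (full buzz), MAX_RANGE_MM -> 0 (off)
    return round(255 * (MAX_RANGE_MM - d) / (MAX_RANGE_MM - MIN_RANGE_MM))

# One duty value per sensor drives one motor in the cuff
readings = [1200, 900, 550, 100, 300, 1000]   # millimeters
duties = [distance_to_duty(r) for r in readings]
```

A linear ramp is the simplest choice; a logarithmic or stepped curve might match human vibration perception better, but the clamp-and-scale structure would be the same.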
We like the use of ToF over ultrasonic for wayfinding. Whether or not ToF is faster, its footprint is much smaller, so it’s more practical for discreet assistive wearables. Plus, you know, lasers. You can see how well it works in the demo video after the break.
This device is intended to augment the traditional white cane, not replace it. This virtual cane we saw a few years ago is another story.
Continue reading “Find Your Way With Tiny Laser Beams” →