An Absolute Zero Of A Project

How would you go about determining absolute zero? Intuitively, it seems like you’d need some complicated physics setup with lasers and maybe some liquid helium. But as it turns out, all you need is some simple lab glassware and a heat gun. And a laser, of course.

To be clear, the method that [Markus Bindhammer] describes in the video below is only an estimation of absolute zero via Charles’s Law, which describes how gases expand when heated. To gather the needed data, [Marb] used a 50-ml glass syringe mounted horizontally on a stand and fitted with a thermocouple. Across from the plunger of the syringe, he placed a VL6180 laser time-of-flight sensor to measure the displacement of the plunger as the air inside the syringe expands.

Data from the TOF sensor and the thermocouple were recorded by a microcontroller as the air inside the syringe was gently heated. Plotting the volume of the gas against the temperature shows a nicely linear relationship, and a linear regression can be extrapolated to find the temperature at which the volume of the gas would reach zero. The result: -268.82°C, or only about four degrees off from the accepted value of -273.15°C. Not too shabby.
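Just to make the extrapolation concrete, here’s a minimal Python sketch of the same idea, with made-up readings standing in for the logged thermocouple and plunger-position data:

```python
import numpy as np

# Hypothetical readings: temperature in °C, gas volume in ml.
# In the real build these come from the thermocouple and the
# ToF-derived plunger position; these numbers are only illustrative.
temperature_c = np.array([22.0, 35.0, 48.0, 61.0, 74.0, 87.0])
volume_ml     = np.array([50.2, 52.4, 54.6, 56.8, 59.0, 61.2])

# Charles's Law says V is linear in T, so fit V = m*T + b ...
m, b = np.polyfit(temperature_c, volume_ml, 1)

# ... and extrapolate to the temperature where the volume would reach zero.
absolute_zero_estimate = -b / m
print(f"Estimated absolute zero: {absolute_zero_estimate:.2f} °C")
```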

[Marb] has been on a tear lately with science projects like these; check out his open-source blood glucose measurement method or his all-in-one electrochemistry lab.

Continue reading “An Absolute Zero Of A Project”

Reconstructing 3D Objects With A Tiny Distance Sensor

There are a whole bunch of different ways to create 3D scans of objects these days. Researchers at the [UW Graphics Lab] have demonstrated how to use a small, cheap time-of-flight sensor to generate scans effectively.

Not yet perfect, but the technique does work…

The key is in how time-of-flight sensors work. They shoot out a distinct pulse of light, and then determine how long that pulse takes to bounce back. This allows them to perform a simple ranging calculation to determine how far they are from a surface or object.
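In code, that ranging calculation really is that simple; a minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance from a time-of-flight echo: the pulse travels out
    and back, so divide the round trip by two."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 2 ns round trip corresponds to roughly 0.3 m.
print(tof_distance(2e-9))
```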

However, in truth, these sensors aren’t measuring distance to a single point. They’re measuring the intensity of the received return pulse over time, called the “transient histogram”, and then processing it. If you use the full mathematical information in the histogram, rather than just the range figures, it’s possible to recreate 3D geometry as seen by the sensor, through the use of some neat mathematics and a neural network. It’s all explained in great detail in the research paper.
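To see what gets thrown away when a sensor reports “just the range figure”, here’s a toy Python sketch that reduces a transient histogram to a single distance by picking its peak bin. The histogram values and bin width are invented; the paper’s method instead feeds the whole histogram to a neural network rather than collapsing it like this:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_histogram(histogram: np.ndarray, bin_width_s: float) -> float:
    """Naive reduction of a transient histogram to one distance: take the
    bin with the strongest return and convert its delay to a range."""
    peak_bin = int(np.argmax(histogram))
    round_trip = peak_bin * bin_width_s
    return SPEED_OF_LIGHT * round_trip / 2.0

# Toy histogram: most photons arrive in bin 12 of a 100 ps-per-bin capture.
hist = np.zeros(64)
hist[11], hist[12], hist[13] = 60, 180, 55
print(range_from_histogram(hist, 100e-12))  # ~0.18 m
```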

The technique isn’t perfect; there are some inconsistencies between what it captures and the true geometry of the objects it’s looking at. Still, the technique is young, and more work could refine its outputs further.

If you don’t mind getting messy, there are other neat scanning techniques out there—like using a camera and some milk.

Continue reading “Reconstructing 3D Objects With A Tiny Distance Sensor”

Time-of-Flight Sensors: How Do They Work?

With the right conditions, this tiny sensor can measure 12 meters

If you need to measure a distance, it is tempting to reach for the ubiquitous ultrasonic module like an HC-SR04. These work well, and they are reasonably easy to use. However, they aren’t without their problems. So maybe try an IR time-of-flight sensor. These also work well, are reasonably easy to use, and have a different set of problems. I recently had a project where I needed such a sensor, and I picked up a TF-MiniS, which is a popular IR distance sensor. They aren’t very expensive, and they can communicate over serial or I2C. So how did it do?
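For the serial route, here’s roughly what reading the module looks like from a PC with pyserial. The port name and baud rate are assumptions (115200 8N1 is the commonly documented default), and the 9-byte frame layout sketched here (0x59 0x59 header, distance in cm, signal strength, temperature, checksum) is the commonly documented one, so check your module’s datasheet:

```python
import serial  # pyserial

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def read_frame(ser: serial.Serial):
    while True:
        # Hunt for the two-byte 0x59 0x59 header.
        if ser.read(1) != b"\x59" or ser.read(1) != b"\x59":
            continue
        payload = ser.read(7)
        if len(payload) != 7:
            continue
        frame = b"\x59\x59" + payload
        # Checksum is the low byte of the sum of the first eight bytes.
        if sum(frame[:8]) & 0xFF != frame[8]:
            continue  # bad checksum, resync
        distance_cm = frame[2] | (frame[3] << 8)
        strength = frame[4] | (frame[5] << 8)
        return distance_cm, strength

print(read_frame(port))
```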

The unit itself is tiny and has good specifications. You can fit the 42 x 15 x 16 mm module anywhere. It only weighs about five grams — as the manufacturer points out, less than two ping-pong balls. It needs 5 V but communicates using 3.3 V, so integration isn’t much of a problem.

At first glance, the range is impressive. You can read things as close as 10 cm and as far away as 12 m. I found this was a bit optimistic, though. Although the product is sometimes marketed as LiDAR, it doesn’t use a laser. It just uses an IR LED and some fancy optics.

Continue reading “Time-of-Flight Sensors: How Do They Work?”

Playing Rock, Paper, Scissors With A Time Of Flight Sensor

You can do all kinds of wonderful things with cameras and image recognition. However, sometimes spatial data is useful, too. As [madmcu] demonstrates, you can use depth data from a time-of-flight sensor for gesture recognition, as seen in this rock-paper-scissors demo.

If you’re unfamiliar with time-of-flight sensors, they’re easy enough to understand. They measure distance by determining the time it takes photons to travel from one place to another. For example, by shooting out light from the sensor and measuring how long it takes to bounce back, the sensor can determine how far away an object is. Take an array of time-of-flight measurements, and you can get simple spatial data for further analysis.

The build uses an Arduino Uno R4 Minima, paired with a demo board for the VL53L5CX time-of-flight sensor. The software is developed using NanoEdge AI Studio. In a basic sense, the system uses a machine learning model to classify data captured by the time-of-flight sensor into gestures matching rock, paper, or scissors—or nothing, if no hand is present. If [madmcu]’s tutorial doesn’t go into enough detail for you, you can take a look at the original version from STMicroelectronics, too.
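The NanoEdge-generated model does the heavy lifting in the real project, but to illustrate the shape of the problem, here’s a stripped-down stand-in in Python: treat each 8×8 VL53L5CX frame as 64 distances and classify it against per-gesture average frames. The nearest-centroid approach and the data layout are our own simplification, not what NanoEdge actually produces:

```python
import numpy as np

LABELS = ["rock", "paper", "scissors", "none"]

def train_centroids(frames: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """frames: (N, 64) flattened depth grids in mm, labels: (N,) indices
    into LABELS. Training data is assumed to be pre-recorded and labelled."""
    return np.stack([frames[labels == i].mean(axis=0)
                     for i in range(len(LABELS))])

def classify(frame: np.ndarray, centroids: np.ndarray) -> str:
    """Pick whichever gesture's average frame is closest to this frame."""
    distances = np.linalg.norm(centroids - frame.reshape(-1), axis=1)
    return LABELS[int(np.argmin(distances))]
```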

It takes some training, and it only works in the right lighting conditions, but this is a functional system that can recognize real hand signs and play the game. We’ve seen similar techniques help more advanced robots cheat at this game before, too! What a time to be alive.

Laser Theremin Turns Your Hand Swooshes Into Music

In a world where smartphones have commoditized precision MEMS sensors, the stage is set to reimagine clusters of these sensors as something totally different. That’s exactly what [chronopoulos] did, taking four proximity sensors and turning them into a custom gesture input device for sound generation. The result is Quadrant, a repurposable human-interface device that proves to be well suited to detecting hand gestures and turning them into music.

At its core, Quadrant is a human interface device built around an STM32F0 and four VL6180X time-of-flight proximity sensors. The idea is to stream the measured distance data from the device as fast as possible and then transform it into musical interactions on the PC side. Computing each distance takes some time, though, so [chronopoulos] does a pipelined read of the array to stream the data to the PC over USB at a respectable 30 Hz.

With the data collected on the PC side, there’s a spread of interactions that are possible. Want a laser harp? No problem, as [chronopoulos] shows how you can “pluck” the virtual strings. How about an orientation sensor? Simply spread your hand over the array and change the angle. Finally, four sensors will also let you detect sweeping gestures that pass over the array, like the swoosh of your hand from one side to the other. To get a sense of these interactions, jump to the video demos at the 2:15 mark after the break.
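To get a feel for how a sweep might be picked out of four distance streams, here’s a simplified Python sketch. The threshold and the logic are invented for illustration; the real detection math is in [chronopoulos]’s companion paper:

```python
# Watch which end of the four-sensor array sees the hand first.
# Sensor 0 is assumed to be the leftmost, sensor 3 the rightmost.
HAND_THRESHOLD_MM = 100  # closer than this counts as "hand present"

def sweep_direction(frames):
    """frames: iterable of (timestamp, [d0, d1, d2, d3]) distance readings."""
    first_hit = {}  # sensor index -> first timestamp the hand appeared
    for t, dists in frames:
        for i, d in enumerate(dists):
            if d < HAND_THRESHOLD_MM and i not in first_hit:
                first_hit[i] = t
    if len(first_hit) < 2:
        return None  # not enough sensors triggered to call it a sweep
    ordered = sorted(first_hit, key=first_hit.get)  # indices by arrival time
    return "left-to-right" if ordered[0] < ordered[-1] else "right-to-left"
```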

If you’re curious to dig into the project’s inner workings, [chronopoulos] has kindly put the firmware, schematics, and layout files on GitHub with a generous MIT License. He’s even released a companion paper [PDF] that details the math behind detecting these gestures. And finally, if you just want to cut to the chase and make music of your own, you can actually snag this one on Tindie too.

MEMS sensors are living a great second life outside our phones these days, and this project is another testament to the richness they offer for new project ideas. For more MEMS-sensor-based projects, have a look at this self-balancing robot and magic wand.

Continue reading “Laser Theremin Turns Your Hand Swooshes Into Music”

Lidar House Looks Good, Looks All Around

A lighthouse beams light out to make itself and its shoreline visible. [Daniel’s] lighthouse has the opposite function, using lasers to map out the area around itself. Using an Arduino and a ToF sensor, the concept is relatively simple. However, connecting to something that rotates 360 degrees is always a challenge.
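The mapping idea boils down to pairing each distance reading with the angle of the rotating head and converting to Cartesian coordinates. Here’s a quick Python sketch with made-up readings, assuming the firmware can report the current angle alongside each measurement:

```python
import math

def scan_to_points(readings):
    """Convert (angle_degrees, distance_mm) pairs from one full rotation
    into (x, y) points around the lighthouse."""
    points = []
    for angle_deg, distance_mm in readings:
        a = math.radians(angle_deg)
        points.append((distance_mm * math.cos(a), distance_mm * math.sin(a)))
    return points

# Example with made-up readings every 90 degrees.
print(scan_to_points([(0, 500), (90, 750), (180, 400), (270, 900)]))
```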

The lighthouse is inexpensive — about $40 — and small. Small enough, in fact, to mount on top of a robot, which would give you great situational awareness on a robot big enough to support it. You can see the device in action in the video below. Continue reading “Lidar House Looks Good, Looks All Around”

A Tongue Operated Human Machine Interface

For interfacing with machines, most of us use our hands and fingers. When you don’t have use of your hands (permanently or temporarily), there are limited alternatives. [Dorothee Clasen] has added one more option, [In]Brace, which is basically a small slide switch that you can operate with your tongue.

[In]Brace consists of a custom-moulded retainer for the roof of your mouth, on which a small ball with an embedded magnet slides along wire tracks. Above the track is a set of three magnetic sensors that can detect the position of the ball. On the prototype, wires from the three sensors run out of the corner of the user’s mouth to a wireless microcontroller (which looks to us like an ESP8266) hooked behind the user’s ear. In a final product, it would obviously be preferable if everything were sealed in the retainer. We think there is even more potential if one of the many 3-axis Hall effect sensors were used, with a small joystick or rolling ball.

The device could be used by disabled persons, for physical therapy, or just for cases where a person’s hands are otherwise occupied. [Dorothee] created a simple demonstration where she plays Pong, or Tong in this case, using only the [In]Brace. Hygiene and making sure that it doesn’t somehow become a choke hazard will be very important if this ever became a product, but we think there is some potential.
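The project doesn’t spell out how the three sensor readings over the track become a position, but one plausible approach is a weighted centroid across the sensors. Here’s a rough Python sketch of that guess; the [In]Brace firmware may well just pick the strongest sensor instead:

```python
def ball_position(sensor_readings):
    """Estimate where the magnetic ball sits along the track from three
    field-strength readings (sensor 0 = left, 2 = right), as a weighted
    centroid. Returns a value between 0.0 and 2.0, or None if no ball."""
    total = sum(sensor_readings)
    if total == 0:
        return None  # no ball detected
    return sum(i * r for i, r in enumerate(sensor_readings)) / total

print(ball_position([10, 80, 30]))  # ~1.17: a bit right of the middle sensor
```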

[Kristina Panos] did a very interesting deep dive into the tongue as an HMI device a while ago, so this isn’t a new idea, but the actual implementations differ quite a lot. Apparently it’s also possible to use your ear muscles as an interface!

Thanks for the tip [Itay]!