[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.
The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and timing how long it takes the light to be emitted and reflected back to a light sensor. Basically, it’s radar, but with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensors.
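If you’re wondering how three scalar distances turn into a 3D position, one standard way is plain trilateration. Here’s a minimal Arduino sketch of the idea; the pin mapping, sensor spacing, and linear 0–5 V scaling over the module’s 48″ range are all assumptions for illustration, not details from [Reza]’s actual firmware.

```cpp
// Hypothetical reconstruction, NOT [Reza]'s firmware: three analog-output
// ToF sensors on A0-A2, mounted in a known triangle, trilaterated to (x,y,z).
// Sensor 1 sits at the origin, sensor 2 at (U,0,0), sensor 3 at (VX,VY,0).
const float U  = 6.0;              // assumed sensor spacing, inches
const float VX = 3.0, VY = 5.2;    // assumed position of the third sensor

float readInches(int pin) {
  // Assumed linear scaling: 0-5 V maps onto the module's 48" range
  return analogRead(pin) * (5.0 / 1023.0) * (48.0 / 5.0);
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  float d1 = readInches(A0);
  float d2 = readInches(A1);
  float d3 = readInches(A2);

  // Standard closed-form trilateration
  float x  = (d1*d1 - d2*d2 + U*U) / (2.0*U);
  float y  = (d1*d1 - d3*d3 + VX*VX + VY*VY - 2.0*VX*x) / (2.0*VY);
  float z2 = d1*d1 - x*x - y*y;
  float z  = (z2 > 0.0) ? sqrt(z2) : 0.0;   // clamp noise-induced negatives

  Serial.print(x); Serial.print(' ');
  Serial.print(y); Serial.print(' ');
  Serial.println(z);
}
```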
Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR works outside in the sun. And because each LIDAR sensor measures distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.
You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.
[youtube=http://www.youtube.com/watch?v=6_Ornv-NlEk&w=470]
[youtube=http://www.youtube.com/watch?v=nk2xY3usY0k&w=470]
[youtube=http://www.youtube.com/watch?v=ACVx70x8mtg&w=470]
Although the LDR-M10 may send out a million pulses per second, it outputs distance through an analog signal with a bandwidth of only 50 Hz.
It also has a beamwidth of 20° and isn’t scanned, so it returns only a single measurement rather than a two-dimensional depth map like the Kinect. While it may work better in sunlight than the Kinect, comparing the two on the basis of that, or of an internal pulse rate you can’t take advantage of, is downright silly.
Furthermore, don’t be fooled into thinking this is full gesture recognition. It cannot sense, for example, whether the user has truly closed his hand or not. It only senses the average position, in three dimensions, of whatever is in front of it. If you wish to interpret that position simultaneously getting lower and a little closer as a hand closure, so be it; but simply moving your open hand that way would produce the same result. Notice that in the demo, the user avoids moving his hand that way, or much along the Z-axis at all, precisely to avoid this.
Rather limited for 3 x $400 = $1,200 worth of sensors. The Sharp GP2Y0A21YK IR distance sensor can do the same thing for $14 each, albeit with a lower update rate and shorter range; still easily good enough to produce the results seen in the “gesture” tracking video above.
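For what it’s worth, reading the Sharp part takes just a few lines on an Arduino. The power-law constants here are a commonly cited empirical fit for its nonlinear analog output, not datasheet values; individual units vary, so calibrate against a ruler:

```cpp
// Rough distance read for a Sharp GP2Y0A21YK on A0. The power-law fit is
// an approximation often quoted for this sensor, not a datasheet formula.
float sharpDistanceCm(int pin) {
  float volts = analogRead(pin) * (5.0 / 1023.0);
  if (volts < 0.42) return 80.0;         // beyond the rated 10-80 cm range
  return 27.86 * pow(volts, -1.15);      // approximate fit over 10-80 cm
}
```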
I think HAD has been “had” by Miremadi to provide some free advertising here.
word :)
Hear hear, Chris. The price tag is outrageous for what it does. This is not gesture recognition at all.
I’d like to see DIY range- and speed-adjusting add-ons for vehicle cruise control systems, something I can put on the front of my 1995 Buick Century.
‘Course it’d be best if it can hide behind the grille instead of looking like a leftover Mad Max prop. ;)
Chris,
You miss the whole point here. The LDR-M10 is the first affordable true time-of-flight LIDAR sensor that works at short distances with a +/- 20 degree field of view.
The GP2Y0A21YK has a +/- 4 degree FOV and will not work in this application. The transmitters from the different sensors would interfere with each other. Finally, not only is it slow, it does not work in sunlight.
There are many sensors out there, but none can do what the LDR-M10 can in terms of speed, FOV, and sunlight capability.
This is the first affordable time of flight sensor for hobby use.
You do have a point about the Sharp sensor’s narrow FOV. Sunlight sensitivity I’m not sure about, so I’ll take your word for it. Interference is solved by firing them sequentially, at the expense of a lower refresh rate.
No problem: use sonar instead. FOV can be adjusted by picking appropriate transducers/modules, and there are certainly no issues with direct sunlight. It will work at equally short distances. Matching your maximum detection range of 48″, bandwidth would be about 139 Hz, depending on atmospheric factors. Firing three sensors sequentially for a 3D reading yields 46 Hz, darn close to your bandwidth.
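To make the sequential-firing scheme concrete, here’s a minimal sketch, assuming three HC-SR04-style modules on arbitrary pins; the ~148 µs/inch constant is the usual round-trip figure for sound in air:

```cpp
// Hypothetical sequential firing of three HC-SR04-style sonar modules, so
// their pings can never interfere; pin assignments are arbitrary.
const int TRIG[3] = {2, 4, 6};
const int ECHO[3] = {3, 5, 7};

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < 3; i++) {
    pinMode(TRIG[i], OUTPUT);
    pinMode(ECHO[i], INPUT);
  }
}

void loop() {
  // Three back-to-back ~7 ms round trips: about 21 ms per 3D fix, ~46 Hz
  for (int i = 0; i < 3; i++) {
    digitalWrite(TRIG[i], HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG[i], LOW);
    // 10 ms timeout comfortably covers a 48" echo (48 * 148 us = 7.1 ms)
    unsigned long us = pulseIn(ECHO[i], HIGH, 10000UL);
    float inches = us / 148.0;           // ~148 us per inch, round trip
    Serial.print(inches);
    Serial.print(i < 2 ? ' ' : '\n');
  }
}
```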
As you say, there are a lot of sensors out there, with overlapping applications. The question is, where does yours fit in? While there may well be a situation where this is the ideal sensor, this certainly isn’t it; the questionably named “gesture recognition” demo makes it look more like a solution in need of a problem.
LIDAR does have great potential, most of which lies in the possibility of a smaller FOV and higher bandwidth for scanning. With your product’s broad +/- 20° FOV and relatively slow 50 Hz bandwidth, it fails to deliver on that potential, or even to place itself in a clear category of usefulness apart from other methods.
And at $400, your product is the last “hobby” method I’d consider. Because in any given situation, there will almost always be something that will do the job cheaper, and maybe even better at the same time.
That is the real point, which I’d say you’ve missed.
Heck, for less than your product, I could even get a Neato XV-11 just to strip the LIDAR module from it. At least then I’d get a full 360° scan with 1° angular resolution at 10 Hz, for an effective 3,600 Hz bandwidth (360 readings per revolution × 10 revolutions per second), which is generally going to be a lot more useful. I couldn’t care less that it’s triangulation-based instead of true time-of-flight. And I’d get the rest of the robot to scavenge as well!
For USD 1,200, you can already buy a scanning LIDAR, the Hokuyo URG-04LX-UG01.
$400 is no hobby price. Use the Maxim MAX3806 TIA with a photodetector and an Osram 1 W or 5 W IR laser and you have a $30 LIDAR system. Works great and costs much less.
Chris,
The XV-11 and ultrasonic are much slower; the whole idea was to have a responsive system.
Emantos,
The URG-04LX-UG01 is five times slower; see above.
Mike,
If it were so easy, how come there are no TOF systems based on it that work at short distances with 0.1″ resolution and a +/- 20 degree FOV? Remember, you need to resolve down to about 20 picoseconds for that resolution: 0.1″ is 5.08 mm of round-trip path, or roughly 17 ps at the speed of light.
I must admit the cost is possibly high for hobby use; however, there are no other solutions at this moment. The cost can come down at higher volume, if I ever get there.
I do concur that most LIDAR systems out there are designed for a narrow beam and longer distances. My goal was to make it wide-beam, fast, and eye-safe for object tracking in sunlight, and I believe I achieved that.
I agree you’ve met your goal. But – I’ll say it one more time – your goal is a set of properties which isn’t particularly useful, and can be achieved with other methods.
I’ve clearly demonstrated ultrasonic can perform your triple-sensor demo, at the same maximum range (both being limiting factors), at 46 Hz versus 50 Hz for your LIDAR. You responded to that by calling it “much slower”, a bit of hyperbole on your part when it’s a mere 8% slower. I can say without any hyperbole, however, that ultrasonic is much cheaper.
Here’s my advice – come up with an alternate version.
First, increase the bandwidth. Chances are the output will have more noise, jittering around the true reading, and that’s OK. If users wish, they can use a simple RC network to filter the output for increased accuracy at the expense of speed; but do NOT make that decision for them, as you’re currently doing.
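And if the user would rather skip the hardware RC, the same trade-off is three lines of software. A sketch of the idea, with the smoothing constant purely illustrative:

```cpp
// Software twin of the RC network: a one-pole low-pass the user can tune.
// For sample rate fs and cutoff fc, alpha ~= 2*pi*fc / (2*pi*fc + fs);
// smaller alpha = smoother but slower. The value below is illustrative.
const float alpha = 0.05;
float filtered = 0.0;

float lowpass(float raw) {
  filtered += alpha * (raw - filtered);   // move a fraction toward the input
  return filtered;                        // call once per sample
}
```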
Second, shrink the FOV to make it more scanning-friendly. Sacrifice sunlight rejection if needed; be honest, what would it be used for anyway? Can you name even one major practical application for which this is clearly superior? For example, you mentioned UAVs as a potential application on your site. But ultrasonic can already do that, so why would someone disadvantage themselves with a larger, heavier, more power-hungry (900 mA!), and more expensive LIDAR?
Then you’ll have a module that appeals to more people and applications. Which in turn will enable you to lower the price, as your sales volume grows.
Chris – what do you gain by blasting the marketability of Reza’s sensor on HAD? You seem to have a specific issue with the product. Did he do you some wrong? Did you buy one and it didn’t work? The video was about using an Arduino with a high-bandwidth distance sensor to provide some semblance of gesture input. That’s something I personally hadn’t seen before, even if it is just a parlor trick.
It’s pretty clear that it’s not useful to you, but extending that to the assumption that it’s not useful to anyone is a bit of a stretch. There are applications where this might be useful, they just clearly are not your applications.
Sure, the price is up there. Increased bandwidth might make it more appealing to me, even at the expense of noise; if I were to use it on my balancing robot, I would want to scan it more quickly than is shown in his videos. However, you generally can’t operate three ultrasonic sensors looking into the same field like this without interference, and ultrasonic can be very dependent on the environment, especially where there is platform vibration and noise (like heavy machinery or aircraft). The resolution of ultrasonic isn’t always what’s desired, but the longer range is nice; they’re a mass-marketed part, so the price is now low after 15 or so years as a commodity. Sharp sensors don’t work in applications where there is little reflectivity, though of course they’re also a mass-market product that’s been out for 15 years.
The last part of your second comment was finally constructive, offering some advice on how to make it better. Before that, I was certain you were just selling ultrasonic sensors; since you’re just an anonymous “Chris”, I wouldn’t know otherwise. Do you have a website where the results of your copious hacking adventures live?
Never seen Reza, his product, or website before.
So this was my first impression of it: a LIDAR represented as doing gesture tracking. Except it’s not *really* gesture tracking but, as you said, a “parlor trick” trying to pass for it. An immediate bad first impression.
Also, let’s be brutally honest. Most of what’s out there is junk. Products, websites, tutorials, you name it. Sucks, but that’s the way it is. You just have to wade through the junk if you want to find the gems.
What disappoints most of all is when something had the *potential* to be a gem, but was crippled by a couple of bad design decisions. What do you hope for in a LIDAR? It’s obvious from all the comments here that this isn’t it. The only positive comment here is from aswdf, and even he will be disappointed to hear that the reason the module was pointed away from the sun is that it’s, according to the website, only resistant to “indirect sunlight”.
Yes, assuming this isn’t useful to *anyone* is a stretch. That’s really not the impression I meant to give. But I think it’s clear that Reza missed the majority of his potential market.
And I agree with your comments on the limitations of IR/ultrasonic sensors. I maintain that three ultrasonic sensors can be used together by firing them in series, with appropriate delays and detection thresholds. Some other limitations could be overcome too; chirped sonar is something that hasn’t penetrated the hobby market yet. Scanning mirror aside, the Neato’s LIDAR works very similarly to the Sharp IR ranger, but with much better specs. They say it could be made for $30 on its own; a shame they won’t market it separately.
I suppose I am an anonymous “Chris”. I most certainly have no financial interest in ultrasonic sensors, though! I don’t have a website by choice, and I claim I can’t make them; otherwise all my friends and co-workers would be hitting me up for free website design in addition to the “fix my computer” requests, which keep me busy enough! Besides, I recognize that many of my hacks fall into the 90% that, by my own standards, I consider junk anyway. :) And as for the others, I think that even if I took the time to detail them, I’d get some good comments, but they’d ultimately be built or adapted by no one. Still, I’ve published a few here and there. Google “dimmable T5HO hood” if you want an example.
Chris,
The power consumption of the LDR-M10 is 900 mW, not 900 mA. At 11 V, the sensor draws 75 mA typical.
Reza, impressive.
Please post a video that shows the module detecting distance while pointed toward the sun. The robot in the sun was neat, but since it was pointed away from the sun, it doesn’t really demonstrate the module’s resistance to flood-out.
I noticed there was a very, very narrow plane where your hand could be while detecting X and Y location. Could this plane be easily ‘deepened’? Although the title says ‘gesture recognition’, it appears (and correct me if I’m wrong) that the sensor’s wide viewing angle wouldn’t be capable of detecting gestures. Would a narrow-viewing-angle scanning LIDAR system be required for gesture recognition?
aswdf,
A gesture is defined as “a movement of part of the body, esp. a hand or the head, to express an idea”; for me, creating a mouse movement is a gesture, especially one with so little jitter. Apparently other people have a different idea of what a gesture is. If you need to detect single fingers, then you are correct, and you would need to resort to a narrow-viewing LIDAR.
As you know, there is no sensor that can work if pointed directly toward the sun (including your eye). This is why I state “indirect sunlight” in the datasheet. I will try to do the hand-motion tracking outside and see if I can video it. I am not sure when I will get to it; however, I will update this link when I do. I can assure you that the sensor has been aimed at the ground in full sunshine in the summer, and all of the sun’s reflected energy in the FOV did not cause saturation on multiple surfaces.
Finally, the Z-axis is currently utilized so that if you are too close to the sensors (saturation), the screen turns red (this did not happen in the video). A few inches back, it is set to blank out the rectangle depending on the energy detected. I have not had the chance to calibrate everything, but one thing I noticed is that the further back you go, the noisier the rectangle gets (due to S/N and the algorithm). Again, I will post videos when I make more improvements; however, I cannot commit to a date.
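In code terms, that gating amounts to simple thresholds on the returned energy. A hypothetical sketch, with threshold values invented for illustration (the real ones are, as noted, uncalibrated):

```cpp
// Hypothetical Z-axis gating, following the description above; the
// threshold values are invented, not taken from the actual demo software.
enum ZState {
  SATURATED,   // too close: screen turns red
  TRACKING,    // workable range: draw the rectangle
  BLANKED      // too little return energy: blank the rectangle
};

ZState classify(float energy) {        // normalized 0..1 detected energy
  const float SAT_LEVEL = 0.95;
  const float MIN_LEVEL = 0.15;
  if (energy > SAT_LEVEL) return SATURATED;
  if (energy < MIN_LEVEL) return BLANKED;
  return TRACKING;
}
```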
Finally, thanks for showing interest, and remember that if you buy a sensor, you have 30 days to return it if you’re not satisfied. Since I do not know your exact application, it is sometimes easier just to run an experiment.
Again, thanks for the encouraging words.