The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight for the beam to be emitted, bounce off the target, and reflect back to a light sensor. Basically, it’s radar but with infrared light. The three sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensor array.
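The math behind that is simple enough to sketch. Here’s a minimal example (ours, not [Reza]’s actual code) of turning a time-of-flight reading into a distance and trilaterating three distances into a 3D point, assuming the three sensors sit at known, non-collinear positions:

```python
import math
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_to_distance(round_trip_s):
    """One-way distance from a round-trip time of flight."""
    return C * round_trip_s / 2.0

def trilaterate(p1, p2, p3, r1, r2, r3):
    """3D point at distances r1, r2, r3 from known sensor positions p1, p2, p3.

    Standard trilateration: build an orthonormal frame from the sensor
    positions, then solve the three sphere equations in that frame.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))  # clamp noise-induced negatives
    return p1 + x * ex + y * ey + z * ez
```

Since the three sensors lie in a plane there’s a mirror-image solution on the other side of the stand (negate z); picking the one in front of the sensors resolves the ambiguity.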
Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR works outside in the sun. And because each LIDAR sensor measures the distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.
You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.
The LIDAR module is made of two big chunks: the laser and optics assembly, and the sensor board seen above. [Hash] put it under the microscope for a better look at the line-scan imager. The magnification helped him find the company name on the die; this particular part is manufactured by Panavision. He figured out the likely model by counting the bonding wires and the pixels between them, which gave a pretty good guess at the resolution. He’s fairly sure it’s a DLIS-2K, and he links to an app note and the datasheet in his post. The chip to the right of the sensor is a TI digital signal processor.
Putting it back together may prove difficult: it will be impossible to realign the optics exactly as they were, so the module will need to be recalibrated. [Hash] plans to investigate how the calibration routines work and will post anything he finds. Check out his description of the teardown in the video after the break.
Google’s showing off this autonomous car at the TED conference right now, but the hardware has already made automated trips from San Francisco to Los Angeles. According to the commentary in the video after the break, the scene above shows the car “hauling Prius ass” on a closed course. The car learned this route while being driven by a person, and now it’s set to take riders through an aggressively driven loop of the cone-adorned parking ramp. On the open road, though, it doesn’t need to be taught anything: it has no problem taking a GPS route and following the rules of the road while traveling from one waypoint to another.
The link above doesn’t include hardware information, but it points to a Times article that includes an infographic. The spinning box on top of the car is a 3D-mapping LIDAR with a 200-foot radius. There’s a rotary encoder on one of the wheels for precise movement data, radar sensors on the front and back bumpers, and a rear-view-mirror-mounted camera for image processing. It makes us wonder how the system performs when the car is coated in road muck. Maybe you just add a dedicated wiper for each sensor.
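The rotary encoder’s contribution is easy to illustrate with a quick sketch. The wheel and encoder numbers below are made up, since the infographic doesn’t give the real specs:

```python
import math

# Assumed values -- the article doesn't give the actual hardware specs.
TICKS_PER_REV = 1024        # encoder counts per wheel revolution
WHEEL_DIAMETER_M = 0.60     # drive wheel diameter, meters
M_PER_TICK = math.pi * WHEEL_DIAMETER_M / TICKS_PER_REV

def odometry(tick_count, dt_s):
    """Distance traveled and speed from raw encoder ticks over an interval."""
    distance = tick_count * M_PER_TICK
    return distance, distance / dt_s

# 5000 ticks in one second works out to about 9.2 m/s, roughly 33 km/h.
print(odometry(5000, 1.0))
```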
Inspired by the successful Kinect bounty put out by Adafruit, [gallamine] of the RobotBox community has posted his own $400 bounty for the first person who can hack the scanning LIDAR from Neato Robotics’ XV-11 vacuum-bot. This sensor would be particularly useful to any robot builders out there, because even the full retail price of the vacuum is less than the cost of most standalone LIDAR units, which often run upwards of $1000. The bounty seems to be growing every day: it started out at $200 and has already doubled thanks to a couple of other interested parties.
Luckily, from what we hear, the sensor was never made to be hack-proof (and is perhaps even secretly hack-friendly?), seeing as one of its lead developers is a member of a certain Home Brew Robotics Club. We love it when companies are nice to hackers, and we hope to see more examples of this in the future. Not sure what the XV-11 is? Be sure to check out the video after the break for info about the vacuum and its scanning LIDAR.
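If you’re tempted by the bounty, the work would presumably start with sniffing the sensor’s serial output. Here’s the general shape of a packet reader using pyserial; the start byte, packet length, and field layout below are pure placeholders, since reverse-engineering the real protocol is exactly what the bounty is for:

```python
import struct
import serial  # pyserial

# Hypothetical framing: [0xFA header][1-byte angle index][4 x uint16 distance, mm].
# None of these constants come from Neato -- they're stand-ins for whatever
# the actual protocol turns out to be.
START_BYTE = 0xFA
PACKET_LEN = 10

def read_packets(port="/dev/ttyUSB0", baud=115200):
    """Yield (angle_index, distances) tuples from a stream of framed packets."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            if ser.read(1) != bytes([START_BYTE]):
                continue  # resync on the start byte
            body = ser.read(PACKET_LEN - 1)
            if len(body) < PACKET_LEN - 1:
                continue  # short read; drop and resync
            index = body[0]
            distances = struct.unpack("<4H", body[1:9])
            yield index, distances
```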
[Blacklight99] made this cool tool: a tester for the radar detectors people keep in their cars. Though it seems like a tool we would rarely need, it’s an interesting project. Some of the speed guns police use have a “stealth” mode that makes them invisible to some detectors, and this tool can tell you whether your detector is vulnerable to it. While it really is just a complicated flashing LED, he notes that it could be taken further and made into a detector that is programmable and not vulnerable to any of the stealth modes.
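The vulnerability itself comes down to timing, which is easy to model. A toy simulation (illustrative numbers only, not based on [Blacklight99]’s hardware): if a gun transmits a burst shorter than the time a sweeping detector spends away from that band, the burst can land entirely in a blind spot.

```python
# Toy model of a "stealth" burst vs. a band-sweeping detector.
# All numbers are illustrative, not measured from real hardware.

def burst_detected(burst_start, burst_len, sweep_period, dwell):
    """True if any part of the burst overlaps a listening window.

    The detector listens on the gun's band for `dwell` seconds out of
    every `sweep_period` seconds; all times are in seconds.
    """
    t = burst_start % sweep_period
    # The listening window occupies [0, dwell) of each sweep cycle.
    return t < dwell or (t + burst_len) > sweep_period

# A 67 ms burst against a detector that checks the band for 20 ms
# out of every 400 ms sweep is usually missed:
print(burst_detected(burst_start=0.100, burst_len=0.067,
                     sweep_period=0.400, dwell=0.020))  # False
```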
A team at UNC Charlotte has been working on an autonomous vehicle to tow a cart full of sensing equipment. Starting with a stock Honda ATV, they added systems to give a Renesas processor control of the vehicle. A model airplane receiver attached to the Renesas provides remote control for Phase 1 of the project. Basically, they’ve turned the ATV into a giant remote-controlled car.
Later revisions will incorporate LIDAR, cameras, and multiple GPS units so the ATV can autonomously traverse most terrain with a high level of accuracy. Path planning will become a large part of the project at that point.
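As a taste of what that phase involves, here’s a minimal grid-based A* planner, the textbook version of the problem rather than anything from the UNC Charlotte team:

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* over a 2D occupancy grid (0 = free, 1 = blocked).

    Returns a list of (row, col) cells from start to goal, or None
    if the goal is unreachable.
    """
    def h(a, b):  # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start, goal), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_score = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue  # already expanded with a better or equal cost
        came_from[cell] = parent
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = ng
                    heapq.heappush(open_set,
                                   (ng + h((nr, nc), goal), ng, (nr, nc), cell))
    return None
```

The real thing would plan over LIDAR-built cost maps in continuous space, but the search skeleton stays the same.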
Radiohead has released the music video for “House of Cards.” We’ve already covered some of the tech involved. As if making an entire video without cameras weren’t edgy enough, they’ve also released all the point data for people to play with and remix, and the band is encouraging people to post their creations to its YouTube group.
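If you want to play along, the frames are plain point lists. Here’s a minimal loader, assuming each row holds x, y, z, intensity values (check the release’s notes for the exact column order; the filename below is just a placeholder):

```python
import csv

def load_frame(path):
    """Load one frame of the released point data as (x, y, z, intensity) tuples."""
    points = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            x, y, z, intensity = map(float, row[:4])
            points.append((x, y, z, intensity))
    return points

# Example remix: spin a frame 90 degrees about the vertical axis.
frame = load_frame("frame0001.csv")  # placeholder filename
remixed = [(-z, y, x, i) for (x, y, z, i) in frame]
```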