When Will Our Cars Finally Speak The Same Language? DSRC For Vehicles

At the turn of the 21st century, it became pretty clear that even our cars wouldn’t escape the Digital Revolution. Years before anyone even uttered the term “smartphone”, it seemed obvious that automobiles would not only become increasingly computer-laden, but would also need a way to communicate with each other and the world around them. After all, the potential gains would be enormous. Imagine if all the cars on the road could tell what their peers were doing.

Forget about rear-end collisions; a car slamming on the brakes would broadcast its intention to stop and trigger a response in the vehicle behind it before the human occupants even realized what was happening. On the highway, vehicles could synchronize their cruise control systems, creating “flocks” of cars that moved in unison and maintained a safe distance from each other. You’d never need to stop to pay a toll, as your vehicle’s computer would communicate with the toll booth and deduct the money directly from your bank account. All of this, and more, would one day be possible. But only if a special low-latency vehicle-to-vehicle communication protocol could be developed, and only if all new cars were mandated to integrate the technology.
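
Just to make the idea concrete (and only the idea; this is nothing like the actual DSRC message format), here’s a minimal sketch of a vehicle broadcasting a hard-braking event to anything in radio range. The port number and every field name are invented for illustration:

```python
# Illustrative sketch only -- NOT the real DSRC message format.
# The port number and all field names are invented for demonstration.
import json
import socket
import time

V2V_ADDR = ("255.255.255.255", 47001)  # hypothetical broadcast address/port

def broadcast_hard_brake(speed_mps: float, decel_mps2: float) -> None:
    """Shout a 'hard braking' event to every vehicle in radio range."""
    msg = {
        "event": "HARD_BRAKE",
        "timestamp": time.time(),
        "speed_mps": speed_mps,
        "decel_mps2": decel_mps2,
    }
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(json.dumps(msg).encode(), V2V_ADDR)

broadcast_hard_brake(speed_mps=27.0, decel_mps2=8.0)  # ~100 km/h, panic stop
```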

Except, of course, that never happened. While modern cars are brimming with sensors and computing power just as predicted, they operate in isolation from the other vehicles on the road. Despite this, a well-equipped car rolling off the lot today is capable of all the tricks promised to us by car magazines circa 1998, and some that even the most breathless of publications would have considered too fantastic to publish. Faced with the challenge of building increasingly “smart” vehicles, manufacturers developed their own individual approaches that don’t rely on an omnipresent vehicle-to-vehicle communication network. The automotive industry has embraced technology like radar, LIDAR, and computer vision, things which back in the 1990s would have been tantamount to saying cars of the future would avoid traffic jams by simply flying over them.

In light of all these advancements, you might be surprised to find that the seemingly antiquated concept of vehicle-to-vehicle communication originally proposed decades ago hasn’t gone the way of the cassette tape. There’s still a push to implement Dedicated Short-Range Communications (DSRC), a WiFi-derived protocol designed specifically for automotive applications, which at this point has been a work in progress for over 20 years. Supporters believe DSRC still holds promise for reducing accidents, but opponents argue it has since been superseded by more capable systems. To complicate matters, a valuable section of the radio spectrum reserved for DSRC by the Federal Communications Commission all the way back in 1999 remains all but unused. So what exactly does DSRC offer, and do we really still need it as we approach the era of “self-driving” cars?

Continue reading “When Will Our Cars Finally Speak The Same Language? DSRC For Vehicles”

Open Source LIDAR Lets You Get Down To The Nitty Gritty

If you’re unfamiliar with LIDAR, you might have noticed it sounds a bit like radar. That’s no accident – LIDAR is a backronym standing for “light detection and ranging”, the word having initially been created as a combination of “light” and “radar”. The average person is most likely to have come into contact with LIDAR at the business end of a police speed trap, but it doesn’t have to be that way. Unruly is the open source LIDAR project you’ve been waiting for all along.

Unlike a lot of starter projects, LIDAR isn’t something you get into with a couple of salvaged LEDs and an Arduino Uno. We’re talking about measuring the time it takes light to travel relatively short distances, so plenty of specialized components are required. There’s a pulsed laser diode and a special hypersensitive avalanche photodiode that operates at up to 130 V. These are combined with precision lenses and filters to ensure operation at the maximum range possible. Given that light travels 300,000 km in a second, a microcontroller alone simply isn’t fast enough to get any usable resolution here. Instead, a specialized time-to-digital converter (TDC) is used to time how long it takes the light pulse to return from a distant object. Unruly’s current usable resolution is somewhere in the ballpark of 10 mm – an impressive feat.
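
To put some numbers on why, here’s a quick back-of-the-envelope calculation (ours, not the project’s) of the timing resolution that 10 mm of range resolution demands:

```python
# Back-of-the-envelope: what timing resolution does 10 mm of range need?
C = 299_792_458            # speed of light, m/s

range_res = 0.010          # 10 mm target resolution
dt = 2 * range_res / C     # the pulse travels out AND back

print(f"Required timing resolution: {dt * 1e12:.0f} ps")   # ~67 ps
print(f"Equivalent clock rate: {1 / dt / 1e9:.1f} GHz")    # ~15 GHz

# Compare with a garden-variety microcontroller:
avr_cycle = 1 / 16e6       # one cycle of a 16 MHz AVR is 62.5 ns
print(f"One AVR cycle of timing error = {avr_cycle * C / 2:.1f} m of range")  # ~9.4 m
```

In other words, a single clock tick on a classic Arduino corresponds to nearly ten meters of range error, which is exactly why a dedicated TDC does the timing.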

It’s a complicated project, requiring the utmost attention to detail to get any results at all. The team behind Unruly have done a great job of both designing and documenting the project. It’s great to see an open source LIDAR package in the wild, giving hackers more options than just the pre-baked commercial modules on the market. We can’t wait to see where the project goes next.

For more on LIDAR, check out last week’s Hackaday podcast – we cover Unruly, as well as a handful of other standout projects in the field.

Hackaday Podcast Ep3 – Igloos, Lidar, And The Blinking LED Of RF Hacking

It’s cold outside! So grab a copy of the Hackaday Podcast, and catch up on what you missed this week.

Highlights include a dip into audio processing with SoX and FFmpeg, scripting for Gmail, weaving your own carbon fiber tubes, staring into the sharpest color CRT ever, and unlocking the secrets of cheap 433 MHz devices. Plus, Elliot talks about his follies in building an igloo while Mike marvels at what’s coming out of passive RFID sensor research.

And what’s that strange noise at the end of the podcast?

Direct Download (59.2 MB MP3)


Continue reading “Hackaday Podcast Ep3 – Igloos, Lidar, And The Blinking LED Of RF Hacking”

New Part Day: Small, Cheap, And Good LIDAR Modules

Fully autonomous cars might never pan out, but in the meantime we’re getting some really cool hardware designed for robotic taxicab prototypes. This is the Livox Mid-40, a LIDAR module you can put on your car or drone. The best part? It only costs $600 USD.

The Mid-40 and Mid-100 are two modules released by Livox, and the specs are impressive: the Mid-40 is able to scan 100,000 points per second at a detection range of 90 m on objects of 10% reflectivity. The Mid-40 sensor weighs 710 grams and comes in a package that is only 88 mm x 69 mm x 76 mm. The Mid-100 is basically the guts of three Mid-40 sensors stuffed into a larger enclosure, capable of 300,000 points per second with a FOV of 98.4° by 38.4°.
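
For a rough sense of what those point rates mean in bandwidth terms, here’s a quick estimate. The 16 bytes per point is purely our assumption for illustration, not Livox’s actual output format:

```python
# Rough bandwidth estimate for the quoted point rates. The 16 bytes per
# point figure is our assumption for illustration, not Livox's wire format.
BYTES_PER_POINT = 16  # e.g. x, y, z as 32-bit floats plus intensity/padding

for name, pps in [("Mid-40", 100_000), ("Mid-100", 300_000)]:
    mbit_s = pps * BYTES_PER_POINT * 8 / 1e6
    print(f"{name}: {pps:,} points/s -> {mbit_s:.1f} Mbit/s")
# Mid-40: 12.8 Mbit/s, Mid-100: 38.4 Mbit/s -- well within 100 Mbit Ethernet
```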

The use case for these sensors is autonomous cars, (large) drones, search and rescue, and high-precision mapping. These units are a bit too large for a skateboard-sized DIY robot car, but a single Livox Mid-40 sensor, pointed downward on a reasonably sized drone, could perform aerial mapping.

There is one downside to the Livox Mid sensors — while you can buy them direct from the DJI website, they’re not in production. These sensors are only ‘Mass-Production ready’. This might be just Livox testing the market before ramping up production, a thinly-veiled press release, or something else entirely. That said, you can now buy a relatively cheap LIDAR module that’s actually really good.

Hackaday Links: January 20, 2019

Let’s say you’re an infosec company, and you want some free press. How would you do that? The answer is Fortnite. Yes, this is how you hack Fortnite. This is how to hack Fortnite. The phrase ‘how to hack Fortnite’ is a very popular search term, and simply including that phrase in the opening paragraph of this post guarantees more views. This is how you SEO.

Lasers kill cameras. Someone at CES snapped a picture of an autonomous car at AEye’s booth, and the LIDAR killed their camera’s sensor. Every subsequent picture had a purple spot in the same place. We already knew lasers can kill camera sensors, and this is a great example of that, but it does open the door to a few questions: if autonomous cars have LIDAR and are covered in cameras, what’s going to happen to the cameras in an autonomous car driving beside another autonomous car? Has anyone ever seen more than one Cruise or Waymo car in the same place at the same time? As an aside, AEye’s website URL is aeye.ai, nearly beating penisland.net (they sell pens on Pen Island) as the worst company URL ever.

This is something I’ve been saying for years, but now there’s finally a study backing me up. Lego is a viable investment strategy. An economist at Russia’s Higher School of Economics published a study that collected the initial sale prices of Lego sets from 1987 to 2015. These were then compared to sales of full sets on the secondary market. Returns were anywhere between 10 and 20% per year, which is crazy. Smaller sets (up to about 100 pieces) had higher returns than larger sets. This goes against my previous belief that a Hogwarts Castle, Saturn V, and UCS Falcon-heavy portfolio would outperform a portfolio made of cheap Lego sets. However, this observation could be tied to the fact that smaller sets included minifig-only packaging, and we all know the Lego minifig market is a completely different ball of wax. The Darth Revan minifig, sold as an exclusive for $3.99 just a few years ago, now fetches $35 on Bricklink. Further study is needed, specifically to separate the minifig market from the complete set market, but the evidence is coming in: Lego is a viable investment strategy, even when you include the 1-2% yearly cost of storing the sets.
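
For the curious, here’s what those rates compound to. The specific percentages are illustrative picks from the ranges quoted above, not figures from the study:

```python
# What do those returns compound to? The 15% gross return and 1.5% storage
# cost are illustrative picks from the ranges above, not study figures.
initial = 100.0            # dollars paid at retail
net_rate = 0.15 - 0.015    # treat storage as a simple drag on the return

for years in (5, 10, 20):
    value = initial * (1 + net_rate) ** years
    print(f"${initial:.0f} set after {years:2d} years: ${value:,.0f}")
# 5 years: ~$188, 10 years: ~$355, 20 years: ~$1,259
```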

Relativity Space got a launchpad. Relativity Space is an aerospace startup that’s building a rocket capable of lobbing my car into Low Earth Orbit with a methalox engine. They’re doing it with 3D printing. [Bryce Salmi], one of the hardware engineers at Relativity Space, recently gave a talk at the Hackaday Superconference about printing an entire rocket. The design is ambitious, but if there’s one device that’s perfectly suited for 3D printing, it’s a rocket engine. There are a lot of nonmachinable tubes going everywhere in those things.

Fail Of The Week: How Not To Make A 3D Scanner

Sometimes the best you can say about a project is, “Nice start.” That’s the case for this as-yet awful DIY 3D scanner, which can serve both as a launching point for further development and as a lesson in what not to do.

Don’t get us wrong, we have plenty of respect for [bitluni] and for the fact that he posts his failures as well as his successes, like generating composite video and AM radio signals from an ESP32. For this project he used an ESP8266, paired with two different sensors: an ultrasonic transducer and a small time-of-flight laser chip. Each was mounted to a two-axis scanner built from hobby servos and 3D-printed parts. The pitch and yaw axes move the sensors through a hemisphere, gathering data as they go, but unfortunately the Wemos D1 Mini lacks the RAM to render the complete point cloud from the raw points, so that job is farmed out to a WebGL page. Initial results with the ultrasonic sensor were not great, and the TOF sensor left much to be desired too. But [bitluni] stuck with it, and got a few results that at least make it look like he’s heading in the right direction.
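
The geometry involved is straightforward spherical-to-Cartesian conversion. Here’s a minimal sketch of the idea, assuming the sensor pivots about the gimbal’s origin (a simplification of ours, not [bitluni]’s actual firmware):

```python
# Minimal sketch: convert (yaw, pitch, distance) scanner readings into XYZ
# points. Assumes the sensor pivots about the origin -- our simplification,
# not [bitluni]'s actual firmware.
import math

def to_xyz(yaw_deg: float, pitch_deg: float, dist_mm: float):
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = dist_mm * math.cos(pitch) * math.cos(yaw)
    y = dist_mm * math.cos(pitch) * math.sin(yaw)
    z = dist_mm * math.sin(pitch)
    return x, y, z

# Sweep the hemisphere the way the two servos do
cloud = [to_xyz(yaw, pitch, 500.0)   # pretend everything is 500 mm away
         for yaw in range(0, 360, 10)
         for pitch in range(0, 91, 10)]
print(len(cloud), "points")
```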

We expect he’ll get this sorted out and come back with some better results, but in the meantime, we applaud his willingness to post this so that we can all benefit from his pain. He might want to check out the results from this polished and pricey LIDAR scanner for inspiration.

Continue reading “Fail Of The Week: How Not To Make A 3D Scanner”

XLIDAR Is A Merry-Go-Round Of Time-Of-Flight Sensors

[JRodrigo]’s xLIDAR project is one of those ideas that seemed so attractively workable that it went directly to a PCB prototype without many stops along the way. The concept was to mount a trio of outward-facing VL53L0X distance sensors on a small PCB disk, then turn that disk with a motor and belt while taking readings. As the sensors turn, their distance readings can be used to paint a picture of the immediate surroundings (at least within about 1 meter, which is the maximum range of the VL53L0X).

The hardware is made to be accessible and has a strong element of “what you see is what you get.” The distance sensors are on small breakout boards, and the board turns the sensor disk via a DC motor and a 3D-printed belt drive. Even the method of encoding the disk’s movement and zero position has the same WYSIWYG straightforwardness: a spring contact and an interrupted bare copper trace on the bottom of the sensor disk act as a physical switch. In fact, exposed copper traces in concentric circular patterns and spring pins taken from an SD card socket are what provide power and communications as the disk turns.
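
Turning the disk angle and the three distance readings into a 2D map is simple trigonometry. Here’s a rough sketch of how it might look, assuming the sensors sit 120° apart with the zero-position switch providing the angle reference (our reconstruction of the idea, not [JRodrigo]’s code):

```python
# Sketch of merging three VL53L0X readings (sensors 120 degrees apart on the
# disk) into one 2D scan. Our reconstruction, not [JRodrigo]'s firmware.
import math

SENSOR_OFFSETS_DEG = (0.0, 120.0, 240.0)

def readings_to_points(disk_angle_deg: float, distances_mm):
    """disk_angle_deg: disk rotation past the zero-position switch."""
    points = []
    for offset, dist in zip(SENSOR_OFFSETS_DEG, distances_mm):
        theta = math.radians(disk_angle_deg + offset)
        points.append((dist * math.cos(theta), dist * math.sin(theta)))
    return points

# One snapshot: disk 30 degrees past zero, three simultaneous readings
print(readings_to_points(30.0, (250.0, 400.0, 120.0)))
```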

The prototype looks good and sounds like it should work, but how well does it hold up? We’ll find out once [JRodrigo] does some testing. Until then, the board designs are available on the project’s GitHub repository if anyone wants to take a shot at their own approach without starting from scratch.