Palm Interface Has You Suggest Where Self Driving Car Should Go

These days, our automobiles sport glittering consoles adorned with dials and digits to keep us up-to-date with our car’s vitals. In the future, though, perhaps we just won’t need such vast amounts of information at our fingertips if our cars are driving themselves around. No information? How will we tell the car what to do? On that end, [Felix] has us covered with Stewart, a tactile gesture-input interface for the modern, self-driving car.

Stewart is a 6-DOF “Stewart Interface” capable of both gesture input and haptic output. Gesture input enables the car’s passenger to deliver driving suggestions to the car. The gentle twist of a wrist can signal an upcoming turn at the next intersection; pulling back on Stewart’s head, joystick-style, signals a “whoa, slow down there, bub!” For haptic output, six servos push Stewart’s head around in the car’s intended direction. If the passenger agrees with the car, she can let Stewart gesture itself in the desired direction; if she disagrees, she can veto the car’s choices by moving her hand directly against Stewart’s current output gesture. Overall, the interface unites the intentions of the car and the intentions of the passenger with a haptic device that makes the connection feel seamless!
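The agree/veto behavior described above boils down to comparing the passenger’s hand motion against the gesture the platform is making. Here’s a minimal sketch of how that decision could work; the function name, thresholds, and 2D simplification are ours, not [Felix]’s actual firmware:

```python
def interpret_input(commanded, measured, deadband=0.1):
    """Classify the passenger's response to a haptic gesture.

    commanded -- (x, y) direction the car is pushing the head toward
    measured  -- (x, y) displacement actually sensed at the head
    Returns 'agree', 'veto', or 'neutral'.
    """
    # The dot product tells us whether the hand moves with or against
    # the commanded gesture.
    dot = commanded[0] * measured[0] + commanded[1] * measured[1]
    magnitude = (measured[0] ** 2 + measured[1] ** 2) ** 0.5
    if magnitude < deadband:
        return "neutral"  # hand is resting; accept the car's plan
    return "agree" if dot > 0 else "veto"

print(interpret_input((1.0, 0.0), (0.8, 0.1)))   # moving with the gesture
print(interpret_input((1.0, 0.0), (-0.5, 0.0)))  # pushing back against it
```

The deadband keeps a relaxed hand from registering as a veto, which is what lets the passenger simply ride along with Stewart’s motion when she agrees.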

We know we’re not supposed to comment on the “how” with art projects, but we’re engineers, and this one makes us giddy with delight. We’re imagining those RC car shock absorbers dramatically damping the jittery servos and giving the user a nice resistive feel. Interconnects are laser-cut acrylic, and the shell is a smoothly contoured 3D print. We’ve seen Stewart interfaces before, but nothing with the look and feel of a sleek design feature on its way to being dropped into the cockpit of our future self-driving cars.

Continue reading “Palm Interface Has You Suggest Where Self Driving Car Should Go”

Art Exhibit Lets You Hide From Self-Driving Cars

In the discussions about how dangerous self-driving cars are – or aren’t – one thing is sorely missing, and that is an interactive game in which you do your best to not be recognized as a pedestrian and subsequently get run over. Even if this is a somewhat questionable take, there’s something to be said for the interactive display over at the Asian Art Museum in San Francisco which has you try to escape the tyranny of machine-vision and get recognized as a crab, traffic cone, or something else that’s not pedestrian-shaped.

Daniel Coppen, one of the artists behind “How (not) to get hit by a self-driving car,” sets up a cone at the exhibit at the Asian Art Museum in San Francisco on March 22, 2024. (Credit: Stephen Council, SFGate)

The display ran from March 21st to March 23rd, with [Stephen Council] of SFGate having a swing at the challenge. As can be seen in the above image, he managed to get labelled as ‘fire’ during one attempt while hiding behind a stop sign as he walked the crossing. Other methods include crawling and (ab)using a traffic cone.

Created by [Tomo Kihara] and [Daniel Coppen], it’s intended to be a ‘playful, engaging game installation’. Both creators make it clear that self-driving vehicles which use LIDAR and other advanced detection methods are much harder to fool, but given how many Teslas are on the road using camera-based systems, it’s still worth demonstrating the shortcomings of the technology.

There’s no shortage of debate about whether or not autonomous vehicles are ready to share the roads with human drivers, especially when they exhibit unusual behavior. We’ve already seen protesters attempt to confuse self-driving systems with methods that aren’t far removed from what [Kihara] and [Coppen] have demonstrated here, and it seems likely such antics will only become more common with time.

Full Self-Driving, On A Budget

Self-driving is currently the Holy Grail in the automotive world, with a number of companies racing to build general-purpose autonomous vehicles that can get from point A to point B with no user input. While no one has brought one to market yet, at least one company has promised this feature and had customers pay for it, only to continually move the goalposts for delivery because of how challenging the problem has turned out to be. But it doesn’t need to be that hard or expensive to solve, at least in some situations.

The situation in question is driving on a single stretch of highway, and it only handles steering, so there’s no accelerator or brake pedal input. The highway is driven normally, using a webcam to take images of the route and an Arduino to capture data about the steering angle. The idea is that with enough training the Arduino could eventually steer the car. But first some math needs to happen on the training data: since the steering wheel spends most of its time pointed straight ahead, the rare genuine steering events have to be rebalanced so the system doesn’t dismiss them as statistical anomalies. After the training, the system does a surprisingly good job at “driving” based on this data, and does it on a budget not much larger than a laptop, a microcontroller, and a webcam.
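One common way to handle that imbalance, sketched below, is to throw away most of the near-zero-angle frames before training so turning events carry real weight. This is an illustrative rebalancing pass, not the project’s actual code; the threshold and keep fraction are placeholder values:

```python
import random

def balance_samples(samples, straight_threshold=2.0, keep_fraction=0.1, seed=42):
    """Downsample straight-driving frames in steering training data.

    samples -- list of (image, steering_angle_degrees) pairs
    Keeps every turning sample, but only ~keep_fraction of the frames
    whose steering angle is within straight_threshold of center.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    balanced = []
    for image, angle in samples:
        if abs(angle) < straight_threshold and rng.random() > keep_fraction:
            continue  # drop most near-center frames
        balanced.append((image, angle))
    return balanced

# 100 straight frames swamping 10 turning frames, as on a real highway
data = [("frame", 0.0)] * 100 + [("frame", 15.0)] * 10
print(len(balance_samples(data)))  # far fewer straight frames survive
```

After a pass like this, the turning samples make up a meaningful fraction of the dataset instead of being noise against a wall of straight-ahead frames.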

Admittedly, this project was a proof-of-concept to investigate machine learning, neural networks, and other statistical algorithms used in these sorts of systems, and doesn’t actually drive any cars on any roadways. Even the creator says he wouldn’t trust it himself, but that he was pleasantly surprised by the results of such a simple system. It could also be expanded out to handle brake and accelerator pedals with separate neural networks as well. It’s not our first budget-friendly self-driving system, either. This one makes it happen with the enormous computing resources of a single Android smartphone.

Continue reading “Full Self-Driving, On A Budget”

Self-Driving Library For Python

Fully autonomous vehicles seem to perennially be just a few years away, sort of like the automotive equivalent of fusion power. But just because robotic vehicles haven’t made much progress on our roadways doesn’t mean we can’t play with the technology at the hobbyist level. You can embark on your own experimentation right now with this open source self-driving Python library.

Granted, this is a library built for much smaller vehicles, but it’s still quite full-featured. Known as Donkey Car, it’s mostly intended for what would otherwise be remote-controlled cars or robotics platforms. The library is built to be as minimalist as possible with modularity as a design principle, and includes the ability to self-drive with computer vision using machine-learning algorithms. It is capable of logging sensor data and interfacing with various controllers as well, either physical devices or through something like a browser.
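Donkey Car’s modularity comes from a “parts” pattern: each part reads named inputs from shared vehicle memory and writes named outputs, and a loop runs every part in order on each tick. The sketch below illustrates that idea in plain Python; it is a simplified stand-in, not the library’s actual API, and the part classes are hypothetical:

```python
class Vehicle:
    """Toy version of a parts-based vehicle loop."""

    def __init__(self):
        self.memory = {}   # shared named channels between parts
        self.parts = []

    def add(self, part, inputs=(), outputs=()):
        self.parts.append((part, inputs, outputs))

    def run_once(self):
        # Run every part in order, wiring outputs to later parts' inputs.
        for part, inputs, outputs in self.parts:
            args = [self.memory.get(name) for name in inputs]
            results = part.run(*args)
            if len(outputs) == 1:
                results = (results,)
            self.memory.update(zip(outputs, results))

class FakeCamera:
    def run(self):
        return [[0] * 160] * 120  # stand-in for a webcam frame

class CenterlinePilot:
    def run(self, image):
        return 0.0, 0.3  # (steering, throttle): straight ahead, gently

v = Vehicle()
v.add(FakeCamera(), outputs=("cam/image",))
v.add(CenterlinePilot(), inputs=("cam/image",), outputs=("steering", "throttle"))
v.run_once()
print(v.memory["steering"], v.memory["throttle"])
```

Because parts only communicate through named channels, swapping a keyboard controller for a neural-network pilot is just a different `add()` call, which is exactly the kind of modularity the library is designed around.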

A complete platform costs around $250 in parts, but most of what’s needed for a Donkey Car-compatible build is easily sourced. It won’t be too long before your own RC vehicle has more “full self-driving” capabilities than a Tesla, and potentially less risk of having a major security vulnerability as well.

Self-Driving Laboratories Do Research On Autopilot

Scientific research is a messy business. The road to learning new things and making discoveries is paved with hard labor, tough thinking, and plenty of dead ends. It’s a time-consuming, expensive endeavor, and for every success, there are thousands upon thousands of failures.

It’s a process so inefficient, you would think someone would have automated it already. The concept of the self-driving laboratory aims to do exactly that, and could revolutionize materials research in particular.

Continue reading “Self-Driving Laboratories Do Research On Autopilot”

Seoul Introduces Self-Driving Taxis

Last year the Seoul city government passed an ordinance enabling the commercial operation of autonomous passenger-carrying vehicles. A six-square-kilometer region in the Seoul neighborhood of Sangam, near the 2002 World Cup Stadium, was designated as a pilot program test bed. This area encompasses 24 streets totaling 31.3 km. Two companies were selected, and the pilot program launched a few weeks ago. Currently there are three vehicles, and passengers can ride for free during this introductory phase. Three more taxis and a bus will be added within the year, with plans for 50 vehicles in this region by 2026. For the time being, these cars require a standby driver who takes control in an emergency and in school zones. Check out the short news report (in English) below the break.

There was a smaller autonomous driving test program in the city of Sejong which we wrote about back in January, and [Alfred Jones] gave a keynote presentation at the 2020 Hackaday Remoticon on the challenges of designing self-driving vehicles if you want to learn more on this topic.

Continue reading “Seoul Introduces Self-Driving Taxis”


Annoy Yourself Into Better Driving With This Turn Signal Monitor

Something like 99% of the people on the road at any given moment will consider themselves an above-average driver, something that’s as statistically impossible as it is easily disproven by casual observation. Drivers make all kinds of mistakes, but perhaps none as annoying and avoidable as failure to use their turn signal. This turn signal monitor aims to fix that, through the judicious use of negative feedback.

Apparently, [Mark Radinovic] feels that he has a predisposition against using his turn signal due to the fact that he drives a BMW. To break him of that habit, one that cost him his first BMW, he attached Arduino Nano 33 BLEs to the steering wheel and the turn signal stalk. The IMUs sense the position of each and send that over Bluetooth to an Arduino Uno WiFi. That in turn talks over USB to a Raspberry Pi, which connects to the car’s stereo via Bluetooth to blare an alarm when the steering wheel is turned but the turn signal remains untouched. The video below shows it in use; while it clearly works, there are a lot of situations where it triggers even though a turn signal isn’t really called for — going around a roundabout, for example, or navigating a sinuous approach to a drive-through window.
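The core decision the Pi has to make is simple: alarm when the wheel is turned but the stalk hasn’t moved. Here’s a hedged sketch of that logic; the function name and threshold values are ours, not [Mark]’s firmware:

```python
def should_alarm(wheel_angle_deg, stalk_angle_deg,
                 wheel_threshold=30.0, stalk_threshold=5.0):
    """Return True when the wheel is turned past the threshold
    while the turn signal stalk is still at rest."""
    turning = abs(wheel_angle_deg) > wheel_threshold
    signalling = abs(stalk_angle_deg) > stalk_threshold
    return turning and not signalling

print(should_alarm(45.0, 0.0))   # turning, no signal -> alarm
print(should_alarm(45.0, 12.0))  # turning, signal on -> quiet
print(should_alarm(10.0, 0.0))   # small correction -> quiet
```

A fixed angle threshold like this also explains the false triggers: a roundabout or a winding drive-through lane turns the wheel well past any reasonable threshold without a signal ever being warranted.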

While [Mark] clearly built this with tongue firmly planted in cheek, we can’t help but think there’s a better way — sniffing the car’s CANbus to determine steering angle and turn signal status comes to mind. This great workshop on CANbus sniffing from last year’s Remoticon would be a great place to start if you’d like a more streamlined solution than [Mark]’s.
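The CANbus approach would amount to listening for the frames that carry steering angle and signal state and decoding them. Arbitration IDs and byte layouts are vehicle-specific and have to be reverse-engineered per model, so everything below — the IDs, the units, the bit positions — is a hypothetical placeholder, just to show the shape of the decode step:

```python
import struct

STEERING_ID = 0x25  # hypothetical arbitration ID for steering angle
SIGNAL_ID = 0x3B    # hypothetical ID for the turn-signal switch state

def decode_frame(arbitration_id, data):
    """Decode the two frames we care about; ignore everything else."""
    if arbitration_id == STEERING_ID:
        # assume angle in 0.1-degree units, signed 16-bit big-endian
        (raw,) = struct.unpack(">h", data[:2])
        return ("steering_deg", raw / 10.0)
    if arbitration_id == SIGNAL_ID:
        # assume bit 0 = left signal, bit 1 = right signal
        return ("signal_on", bool(data[0] & 0b11))
    return None

print(decode_frame(0x25, bytes([0x01, 0x2C])))  # raw 300 -> 30.0 degrees
print(decode_frame(0x3B, bytes([0x02])))        # right signal on
```

Feeding decoded values like these into the same alarm logic would replace two IMUs, two Arduinos, and a Bluetooth hop with a single bus tap.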

Continue reading “Annoy Yourself Into Better Driving With This Turn Signal Monitor”