
Annoy Yourself Into Better Driving With This Turn Signal Monitor

Something like 99% of the people on the road at any given moment will consider themselves an above-average driver, a claim that’s as statistically impossible as it is easily disproven by casual observation. Drivers make all kinds of mistakes, but perhaps none as annoying and avoidable as failing to use their turn signals. This turn signal monitor aims to fix that, through the judicious use of negative feedback.

Apparently, [Mark Radinovic] feels he has a predisposition against using his turn signal because he drives a BMW. To break himself of that habit, one that cost him his first BMW, he attached Arduino Nano 33 BLEs to the steering wheel and the turn signal stalk. The IMUs sense the position of each and send it over Bluetooth to an Arduino Uno WiFi. That in turn talks over USB to a Raspberry Pi, which connects to the car’s stereo via Bluetooth to blare an alarm when the steering wheel is turned but the turn signal remains untouched. The video below shows it in use; while it clearly works, there are plenty of situations where it triggers even though a turn signal isn’t really called for — going around a roundabout, for example, or navigating a sinuous approach to a drive-through window.
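
If you fancy replicating the idea, the decision at the heart of it is simple enough to sketch out. The snippet below is our minimal take on the logic, not [Mark]’s actual code, with stub functions standing in for the BLE and Bluetooth plumbing: sound the alarm once the wheel has been off-center for a grace period without the stalk budging.

```cpp
// Minimal sketch of the trigger logic, separate from the wireless plumbing.
// The three stubs are hypothetical stand-ins for however the IMU readings
// and the alarm actually travel between the boards.

const float STEER_DEG = 30.0;         // wheel rotation that counts as a turn
const float STALK_DEG = 5.0;          // stalk movement that counts as signaling
const unsigned long GRACE_MS = 1500;  // grace period before the alarm fires

unsigned long turnStart = 0;
bool signaled = false;

float readSteeringAngle() { return 0; }  // stub: yaw from wheel-mounted Nano 33 BLE
float readStalkAngle()    { return 0; }  // stub: yaw from stalk-mounted Nano 33 BLE
void  soundAlarm()        { }            // stub: tell the Pi to blare the stereo

void setup() { }

void loop() {
  if (fabs(readStalkAngle()) > STALK_DEG) signaled = true;  // driver signaled

  if (fabs(readSteeringAngle()) > STEER_DEG) {
    if (turnStart == 0) turnStart = millis();               // wheel just left center
    if (!signaled && millis() - turnStart > GRACE_MS) soundAlarm();
  } else {
    turnStart = 0;                                          // back near center: reset
    signaled = false;
  }
  delay(20);
}
```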

While [Mark] clearly built this with tongue firmly planted in cheek, we can’t help but think there’s a better way — sniffing the car’s CAN bus to determine steering angle and turn signal status comes to mind. The workshop on CAN bus sniffing from last year’s Remoticon would be a great place to start if you’d like a more streamlined solution than [Mark]’s.
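
For the curious, here is roughly what that CAN bus route could look like from a Raspberry Pi with a CAN interface, using Linux’s SocketCAN. Fair warning: the frame IDs and byte layouts below are pure placeholders; every make and model lays out its messages differently, and finding the real ones is exactly what the sniffing is for.

```cpp
#include <cstdio>
#include <cstdint>
#include <cstring>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main() {
    // Open a raw CAN socket on can0 (e.g. a CAN HAT on the Pi).
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);

    ifreq ifr{};
    std::strcpy(ifr.ifr_name, "can0");
    ioctl(s, SIOCGIFINDEX, &ifr);

    sockaddr_can addr{};
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    bind(s, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    can_frame frame;
    while (read(s, &frame, sizeof(frame)) > 0) {
        if (frame.can_id == 0x025) {         // hypothetical steering-angle frame
            int16_t angle = (frame.data[0] << 8) | frame.data[1];
            std::printf("steering angle: %d\n", angle);
        } else if (frame.can_id == 0x3B3) {  // hypothetical turn-signal frame
            std::printf("signal bits: %02x\n", (unsigned)frame.data[0]);
        }
    }
    return 0;
}
```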

Continue reading “Annoy Yourself Into Better Driving With This Turn Signal Monitor”

A Self-Driving Bicycle Is Something To Marvel At

One of the most annoying things about bicycles is that they don’t stay up on their own, especially when they’re stationary. That’s why they come with stands, after all. That said, if you had plenty of advanced electronic and mechanical equipment fitted to one, you could do something about that, and that’s just what [稚晖君] did.

The video of the project comes without subtitles or any translation, but the gist of it is this: a reaction wheel is fitted to the seat tube, along with a motor that turns the handlebars via a linkage attached to the head stem. There’s also a motor to drive the bicycle forward via a friction drive on the rear wheel. Combine these with an inertial measurement unit and a suitable control system, and you have a bike that can balance while standing perfectly still.
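
We haven’t picked apart [稚晖君]’s firmware, but the heart of any reaction-wheel balancer is a feedback loop along these lines, with the IMU’s roll estimate driving the wheel motor. The gains and stub functions here are purely illustrative:

```cpp
// A minimal sketch of a reaction-wheel balance loop, assuming an IMU that
// reports roll angle and rate and a motor driver that takes a signed torque
// command. None of these names or numbers come from the actual firmware.

const float KP = 40.0;   // corrects lean angle
const float KD = 2.5;    // damps lean rate
const float KW = 0.02;   // bleeds off wheel speed so the wheel never saturates

float imuRollDeg()  { return 0; }   // stub: roll angle, degrees
float imuRollRate() { return 0; }   // stub: roll rate, degrees per second
float wheelSpeed()  { return 0; }   // stub: reaction wheel speed
void  setWheelTorque(float t) { }   // stub: signed command to the motor driver

void setup() { }

void loop() {
  // Spinning the wheel up in one direction torques the frame the other way
  // (conservation of angular momentum), pushing the bike back upright.
  float torque = KP * imuRollDeg() + KD * imuRollRate() + KW * wheelSpeed();
  setWheelTorque(torque);
  delay(5);   // ~200 Hz control loop
}
```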

The system’s performance is impressive; it can even hold the bike perfectly upright while balanced on a fence rail. Thanks to an onboard camera and LIDAR system, the bike can also drive itself around with no rider on board, which is quite a spooky sight. Find a way to do the same while hiding the extra mechanics and you’d have one hell of a Halloween display.

Similar projects have been attempted in the past; we featured a self-balancing bike built as a university project back in the distant past of 2012. Video after the break.

Continue reading “A Self-Driving Bicycle Is Something To Marvel At”

Self-Driving Or Mind Control? Which Do You Prefer?

We know you love a good biohack as much as we do, so we thought you would like [Tony’s] brainwave-controlled RC truck. Instead of building his own electroencephalogram (EEG), he thought he would use NeuroSky’s MindWave. EEG signals are pretty complex, multi-frequency waves that require some fairly sophisticated circuitry and even more sophisticated signal processing to interpret. So, [Tony] thought it would be nice to offload a bit of that heavy lifting, and luckily for him, the MindWave headset is fairly hacker-friendly.

EEG is a very active area of research, so some of the finer details of the signal are still being debated. However, it appears that attention can be quantified by measuring alpha waves, the EEG content in the roughly 8-12 Hz band. And it seems as though eye blinks can be picked out of the EEG as well. Conveniently, the MindWave exports these energy levels to an accompanying smartphone application, which [Tony] then links to his Arduino over Bluetooth using the ever-so-popular HC-05 module.
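
On the Arduino’s end, reading that exported value is pleasantly boring. Here’s a minimal sketch assuming the phone app forwards the 0-100 attention figure as plain text lines; that framing is our assumption, since [Tony]’s app may format its data differently:

```cpp
#include <SoftwareSerial.h>

// Receive attention values from the phone via an HC-05, assuming (for
// illustration) that each reading arrives as a text line like "A:57\n".

SoftwareSerial bt(10, 11);  // RX, TX wired to the HC-05

int attention = 0;

void setup() {
  Serial.begin(9600);
  bt.begin(9600);           // HC-05 default baud rate
}

void loop() {
  if (bt.available()) {
    String line = bt.readStringUntil('\n');
    if (line.startsWith("A:")) {
      attention = line.substring(2).toInt();  // 0-100 attention level
      Serial.println(attention);              // echo for debugging
    }
  }
}
```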

To control the car, he utilized the existing remote control instead of rolling his own. Like most people, [Tony] first thought about wiring the Arduino pins straight to the buttons on the remote, bypassing the physical controls, but the buttons were a bit smaller than he was comfortable soldering to and he didn’t want to risk damaging the circuit board. His RC truck has a pistol-grip transmitter, though, which inspired a slightly different approach: he mounted a servo onto the controller’s wheel mechanism, letting him steer the truck by rotating the wheel with the servo, and fashioned a second servo onto the transmitter to depress the throttle trigger as it rotates. We thought that was a pretty nifty workaround.
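
Wiring the parsed brainwave data to those two servos then only takes a few more lines. Every pin number, angle, and threshold below is a placeholder that would need tuning to the actual transmitter:

```cpp
#include <Servo.h>

// The actuation side, sketched: one servo turns the transmitter's wheel,
// the other squeezes the pistol-grip throttle trigger.

Servo steeringServo, throttleServo;

void setup() {
  steeringServo.attach(5);
  throttleServo.attach(6);
  steeringServo.write(90);   // wheel centered
  throttleServo.write(90);   // trigger released
}

// Called with values parsed from the HC-05, as in the sketch above.
void drive(int attention, bool blink) {
  if (attention > 60) {
    // Focus harder and the servo squeezes the trigger deeper.
    throttleServo.write(map(attention, 60, 100, 95, 130));
  } else {
    throttleServo.write(90);               // relax and the truck coasts to a stop
  }
  steeringServo.write(blink ? 45 : 90);    // an eye blink nudges a turn
}

void loop() {
  // In practice: parse attention and blink flags from the Bluetooth link,
  // then call drive(attention, blink).
}
```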

Cool project, [Tony]! We’ve seen some impressive EEG Hackaday Prize entries before. Maybe this could be the next big one.

Continue reading “Self-Driving Or Mind Control? Which Do You Prefer?”

Uber Traded Away Its In-House Self-Driving Effort

Perhaps the best-known ridesharing service, Uber has grown rapidly over the last decade. Since its founding in 2009, it has expanded into markets around the globe, and entered the world of food delivery and even helicopter transport.

One of the company’s main headline research areas was the development of autonomous cars, which would revolutionize its business model by eliminating the need to pay human drivers. However, as of December, the company has announced that it is spinning off its driverless car division in a deal reportedly worth $4 billion, though that’s all on paper — Uber is trading its autonomous driving division, and a promise to invest a further $400 million, in return for a 26% share in the self-driving tech company Aurora Innovation.

Playing A Long Game

Uber’s self-driving efforts have been undertaken in close partnership with Volvo in recent years.

Uber’s driverless car research was handled by the internal Advanced Technologies Group (ATG), made up of 1,200 employees dedicated to working on the new technology. The push to eliminate human drivers from the ride-sharing business model was a major consideration for investors in Uber’s initial public offering on the NYSE in 2019. The company has yet to post a profit, and reducing the share of fares paid out to human drivers would make that crucial goal far easier to reach.

However, Uber’s efforts have not been without incident. Tragically, in 2018, a development vehicle running in autonomous mode struck and killed a pedestrian in Tempe, Arizona. This marked the first pedestrian fatality caused by an autonomous car, and led to the suspension of the company’s on-road testing. The incident revealed shortcomings in Uber’s technology and processes, and left a lasting black mark on the program.

ATG has been purchased by a Mountain View startup, Aurora Innovation, Inc., which counts several self-driving luminaries among its cofounders. Chris Urmson, now CEO, was a technical leader at Google’s self-driving research group. Drew Bagnell worked on autonomous driving at Uber, and Sterling Anderson came to the startup from Tesla’s Autopilot program. The company was founded in 2017, and counts Hyundai and Amazon among its venture capital investors.

Aurora could also have links with Toyota, which invested in ATG under Uber’s ownership in 2019. Unlike Uber, which focused solely on building viable robotaxis for limited geographical areas, Aurora aims to make the Aurora Driver, the core of its technology, adaptable to everything from “passenger sedans to class-8 trucks”.

Aurora has been developing self-driving technology to handle real-world situations since its founding in 2017. Being able to master the challenges of a crowded city will be key to succeeding in the marketplace.

Getting rid of ATG certainly spells the end of Uber’s in-house autonomous driving effort, but it doesn’t mean the company is getting out of the game. Holding a stake in Aurora, Uber still stands to profit from its early investment, and will retain access to the technology as it develops. At the same time, trading ATG away to an outside firm puts daylight between the rideshare company and any negative press from future testing incidents.

Even if Aurora only retains 75% of ATG’s 1,200 employees, it’s doubling in size, and will be worth keeping an eye on in the future.

Alfred Jones Talks About The Challenges Of Designing Fully Self-Driving Vehicles

The leap to self-driving cars could be as game-changing as the one from horse power to engine power. If cars prove able to drive themselves better than humans do, the safety gains could be enormous: auto accidents were the eighth leading cause of death worldwide in 2016. And who doesn’t want to turn travel time into something either truly restful or genuinely productive?

But getting there is a big challenge, as Alfred Jones knows all too well. As Head of Mechanical Engineering at Lyft’s Level 5 self-driving division, he leads the team building the roof racks and other gear that give the vehicles their sensors and computational hardware. In his keynote talk at Hackaday Remoticon, Alfred Jones walks us through what each level of self-driving means, how the problem is being approached, and where the sticking points lie between what’s being tested now and a truly steering-wheel-free future.

Check out the video below, and take a deeper dive into the details of his talk.

Continue reading “Alfred Jones Talks About The Challenges Of Designing Fully Self-Driving Vehicles”

Open Source Self-Driving Smartphone Robot

Our smartphones are incredibly powerful computers in their own right, yet we don’t often see them directly integrated into projects. Intel’s Intelligent Systems Lab has done exactly that with the release of OpenBot, an open source smartphone-based self-driving robot.

Most of the magic happens on the smartphone, which runs an app built on TensorFlow Lite that fuses the phone’s camera and sensor array with data from the ultrasonic sensors and wheel encoders on the robot. The robot itself is relatively simple: four geared DC motors and motor drivers wired to an Arduino Nano, which interfaces with the Android phone over serial.
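
The real firmware lives in the OpenBot repository, but a stripped-down sketch shows just how little the Nano has to do. For illustration we assume the phone sends signed PWM pairs as text lines; OpenBot’s actual serial protocol differs, as do the pin assignments:

```cpp
// Drive two sides of a differential robot from serial commands, assuming
// one "left,right\n" line per update with values from -255 to 255.

const int L_FWD = 5, L_REV = 6, R_FWD = 9, R_REV = 10;  // motor driver inputs

void setMotor(int fwdPin, int revPin, int pwm) {
  pwm = constrain(pwm, -255, 255);
  analogWrite(fwdPin, pwm > 0 ?  pwm : 0);   // positive: forward pin gets PWM
  analogWrite(revPin, pwm < 0 ? -pwm : 0);   // negative: reverse pin gets PWM
}

void setup() {
  Serial.begin(115200);   // USB serial link to the Android phone
}

void loop() {
  if (Serial.available()) {
    int left  = Serial.parseInt();   // parseInt skips the comma separator
    int right = Serial.parseInt();
    setMotor(L_FWD, L_REV, left);
    setMotor(R_FWD, R_REV, right);
  }
}
```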

The app created by the Intel ISL team comes preloaded with three AI models that can do either person following or two different modes of autonomous navigation. By connecting a Bluetooth controller to the smartphone and driving the robot around manually while collecting data, you can train a custom autonomous driving policy to suit your specific environment.

This looks like an excellent way to get a taste of autonomous robots on a small budget, while still being a viable base for more demanding applications. We’ve seen only a few smartphone-based robots, like DriveMyPhone and SmartiPresense, which don’t have AI capabilities but are intended for telepresence applications. We’ve always wondered why we don’t see more projects built around cellphones, so we welcome the example.

Continue reading “Open Source Self-Driving Smartphone Robot”

Self-Driving RC Truck Is A Master’s Thesis In Cybernetics And Robotics

RC cars are a fun pastime, but for many hackers, taking things to the next level means making the cars drive themselves. For his master’s thesis, [Jon] did just that, building a self-driving robot truck that confidently cruises the floor of his laboratory.

The truck is based on a 1/14th-scale Tamiya chassis, which a prior group had fitted out with an inductive charging system. On top of this platform, [Jon] added a Jetson TX2 to act as the brains of the system, hooking it up to a Slamtec RPLIDAR scanner to map the surrounding environment. There’s also a Teensy microcontroller on board, which handles synthesizing PWM signals for the radio control hardware that drives the truck, and a Logitech webcam up front for machine vision. The truck can operate in a variety of modes: full manual control, driving based on LIDAR mapping, or under an AI that steers from camera data. It’s also programmed to drive a route that includes an inductive charging pad, so it can keep its power levels up without human intervention.
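
That Teensy’s job is worth a closer look: RC receivers and speed controllers expect servo-style pulses between roughly 1000 and 2000 microseconds, so the microcontroller simply translates serial commands from the Jetson into pulse widths. Something like the following, with a line format we’ve made up for the sake of the example rather than taken from [Jon]’s code:

```cpp
#include <Servo.h>

// Turn serial commands from the Jetson into the pulse widths RC gear
// expects, assuming (for illustration) one "steer,throttle\n" line per
// update with values in microseconds, e.g. "1450,1600\n".

Servo steering, throttle;

void setup() {
  Serial.begin(115200);              // USB serial from the Jetson TX2
  steering.attach(3);
  throttle.attach(4);
  steering.writeMicroseconds(1500);  // 1500 us = centered / neutral
  throttle.writeMicroseconds(1500);
}

void loop() {
  if (Serial.available()) {
    int steerUs    = Serial.parseInt();  // parseInt skips the comma
    int throttleUs = Serial.parseInt();
    steering.writeMicroseconds(constrain(steerUs, 1000, 2000));
    throttle.writeMicroseconds(constrain(throttleUs, 1000, 2000));
  }
}
```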

It’s a great blueprint for a self-driving system, and [Jon]’s thesis (available on this page as a 67 MB PDF) goes into great detail on how everything works at the base level. His code is on GitHub for the curious. We’ve seen similar projects before too, like this robot that navigates its builder’s house using LIDAR. Video after the break.

Continue reading “Self-Driving RC Truck Is A Master’s Thesis In Cybernetics And Robotics”