Does Tesla’s Autosteer Make Cars Less Safe?

In 2016, a Tesla Model S T-boned a tractor trailer at full speed, killing its lone occupant instantly. It was running in Autosteer mode at the time, and neither the driver nor the car’s automatic braking system reacted before the crash. The US National Highway Traffic Safety Administration (NHTSA) investigated the incident, requested data from Tesla related to Autosteer safety, and eventually concluded that there wasn’t a safety-related defect in the vehicle’s design (PDF report).

But the NHTSA report went a step further. Based on the data that Tesla provided them, they noted that since the addition of Autosteer to Tesla’s confusingly named “Autopilot” suite of functions, the rate of crashes severe enough to deploy airbags declined by 40%. That’s a fantastic result.

Because it was so spectacular, a private company with a history of investigating automotive safety wanted to have a look at the data. The NHTSA refused because Tesla claimed that the data was a trade secret, so Quality Control Systems (QCS) filed a Freedom of Information Act lawsuit to get the data on which the report was based. Nearly two years later, QCS eventually won.

Looking into the data, QCS concluded that crashes may actually have increased by as much as 60% with the addition of Autosteer, or may not have changed at all. Either way, the data provided to the NHTSA was not sufficient to support its conclusion and contained bizarre omissions, and the NHTSA has since retracted their safety claim. How did this NHTSA one-eighty happen? Can we learn anything from the report? And how does this all line up with Tesla’s claim of better-than-average safety? We’ll dig into the numbers below.
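To see how the same crash counts can support opposite conclusions, consider this toy calculation. All numbers here are hypothetical, chosen only to illustrate the arithmetic; the real dispute centers on exposure mileage fields that were missing or ambiguous for most vehicles in the dataset:

```python
# Hypothetical numbers for illustration only -- not Tesla's actual figures.
# A crash *rate* needs both crash counts and exposure miles; if the
# mileage denominator is uncertain, the direction of the trend flips.

def airbag_crash_rate(crashes, million_miles):
    """Airbag-deployment crashes per million miles driven."""
    return crashes / million_miles

# Scenario A: take all reported miles at face value (a 40%-ish decline).
before_a = airbag_crash_rate(crashes=13, million_miles=10.0)
after_a = airbag_crash_rate(crashes=8, million_miles=10.0)

# Scenario B: only a fraction of the post-Autosteer miles can actually
# be attributed to Autosteer-equipped driving -- same crash counts,
# smaller denominator, and the "decline" becomes an increase.
after_b = airbag_crash_rate(crashes=8, million_miles=4.0)

print(f"A: {before_a:.2f} -> {after_a:.2f} per M miles (decline)")
print(f"B: {before_a:.2f} -> {after_b:.2f} per M miles (increase)")
```

The numbers are invented, but the structure of the argument is exactly QCS’s: without trustworthy mileage data, the 40% figure is not recoverable.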

But if nothing else, Tesla’s dramatic reversal of fortune should highlight the need for transparency in the safety numbers of self-driving and other advanced car technologies, something we’ve been calling for for years now.


The Cat, The Aircraft, And The Tiny Computer

Sharing your life with a cat is a wonderful and fulfilling experience. Sharing your life with an awake, alert, and bored cat in the early hours when you are trying to sleep, is not. [Simon Aubury] has just this problem, as his cat [Snowy] is woken each morning by a jet passing over. In an attempt to identify the offending aircraft, he’s taken a Raspberry Pi and a software-defined radio, and attempted to isolate it by spotting its ADS-B beacon.

The SDR was the ubiquitous RTL chipset model, and it provided a continuous stream of aircraft data. To process this data he used an Apache Kafka stream processing server, into which he also pulled aircraft identification data from an online service. Kafka’s SQL interface for interrogating multiple streams allowed him to untangle the mess of ADS-B returns and generate a meaningful feed of aircraft. This in turn was piped into an Elasticsearch database, upon which he built a Kibana visualisation.
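The core of that Kafka SQL query is a stream-table join: raw ADS-B returns keyed by the aircraft’s ICAO address get enriched with registry data. Here is a minimal Python sketch of that join logic; the field names and sample values are illustrative assumptions, not [Simon]’s actual schema:

```python
# Static "table" side of the join: ICAO 24-bit address -> registry info,
# as fetched from an online aircraft-identification service.
registry = {
    "7C6B2D": {"registration": "VH-VYE", "type": "B738"},
    "7C7A4F": {"registration": "VH-XZP", "type": "A320"},
}

# Streaming side: raw ADS-B returns as they arrive from the RTL-SDR.
adsb_stream = [
    {"icao": "7C6B2D", "alt_ft": 3200, "lat": -33.95, "lon": 151.18},
    {"icao": "7C7A4F", "alt_ft": 2900, "lat": -33.93, "lon": 151.20},
    {"icao": "ABCDEF", "alt_ft": 35000, "lat": -34.10, "lon": 151.00},
]

def enrich(stream, table):
    """Join each ADS-B return against the registry; drop unmatched returns."""
    for msg in stream:
        info = table.get(msg["icao"])
        if info is not None:  # inner join: unknown aircraft are skipped
            yield {**msg, **info}

for flight in enrich(adsb_stream, registry):
    print(flight["registration"], flight["type"], flight["alt_ft"])
```

Kafka does the same thing continuously and at scale, but the shape of the result, a feed of identified aircraft ready for Elasticsearch, is the same.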

The result was that any aircraft could be identified at a glance and potential noise hotspots forecast. Whether all this heavy lifting was worth the end result is for you to decide, but it does provide an interesting introduction to the technologies and software involved. It is, however, possible to monitor ADS-B traffic considerably more simply.

Thanks [Oleg Anashkin] for the tip.

Adding Real Lenses To An Instant Camera

The Instax SQ6 and Fujifilm’s entire range of instant cameras are fun little boxes that produce instant photos. It’s a polaroid that’s not Polaroid, and like most instant cameras, the lenses are just one or two pieces of plastic. A lens transplant is in order, and that’s exactly what [Kevin] did to his Instax camera.

The key to this lens transplant project is to make it not look like a complete hack job. For this, [Kevin] is keeping the number of custom mechanical parts to a minimum, with just two pieces. There’s a lens shroud that screws down to the current flange on the camera’s plastic chassis, and should blend in perfectly with the rest of the camera. This demanded a significant amount of 3D modeling to get perfect. The other mechanical part is just a plastic disc with a hole in it. These parts were ordered from Shapeways and bolted to the camera with only a few problems regarding spacing and clearances. This didn’t prevent the camera from coming back together, which is when the documentation becomes fast and loose. Who could blame him? The idea of putting real lenses on an instant camera is something few can resist, and the pictures that come out of this modified camera look great.

The current state of the project with a single lens leads the camera to have an inaccurate and tunnel-like viewfinder, but a huge modification brings this project into twin-lens reflex territory. There are more modifications than camera here, but all the printed parts are documented, there are part numbers for McMaster-Carr, and the camera has full control over focusing and framing.

Safely Dive Into Your Fears With Virtual Reality

What makes you afraid? Not like jump-scares in movies or the rush of a roller-coaster, but what are your legitimate fears that qualify as phobias? Spiders? Clowns? Blood? Flying? Researchers at The University of Texas at Austin are experimenting with exposure therapy in virtual reality to help people manage their fears. For some phobias, like arachnophobia, the fear of spiders, this seems like a perfect fit. If you are certain that you are safely in a spider-free laboratory wearing a VR headset, and you see a giant spider crawling across your field of vision, the fear may be more manageable than being asked to put your hand into a populated spider tank.

After the experimental therapy, participants were asked to take the spider tank challenge. Subjects who were not shown VR spiders were less enthusiastic about keeping their hands in the tank. This is not definitive proof, but it is a promising start.

High-end VR equipment and homemade rigs are in the budget for many gamers and hackers, and our archives are an indication of how much the cutting-edge crowd loves immersive VR. We have been hacking 360 recording for nearly a decade, long before 360 cameras took their niche in the consumer market. Maybe when this concept is proven out a bit more, implementations will start appearing in our tip lines with hackers who helped their friends get over their fears.

Via IEEE Spectrum.

Photo by Wokandapix.

PrintRite Uses TensorFlow To Avoid Printing Catastrophes

TensorFlow is a popular machine learning package that, among other things, is particularly adept at image recognition. If you want to use a webcam to monitor cats on your lawn or alert you to visitors, TensorFlow can help you achieve this with a bunch of pre-baked libraries. [Eric] took a different tack with PrintRite – using TensorFlow to monitor his 3D printer and warn him of prints gone bad – or worse.

The project relies on training TensorFlow to recognize images of 3D prints gone bad. If layers are separated, or the nozzle is covered in melted goo, it’s probably a good idea to stop the print. Worst case, your printer could begin smoking or catch fire – in that case, [Eric] has the system configured to shut the printer off using a TP-Link Wi-Fi enabled power socket.
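The decision loop around the classifier might look something like this sketch, with the TensorFlow model stubbed out as a single failure score per camera frame. The thresholds, action names, and function names are our assumptions for illustration, not [Eric]’s actual code:

```python
# Sketch of a classify-then-act loop: given a "probability the print has
# failed" score for each frame, pause the print or cut power entirely.

FAIL_THRESHOLD = 0.80   # layers separated, blobbed nozzle: stop the print
FIRE_THRESHOLD = 0.95   # smoke or worse: kill power at the smart socket

def classify_frame(frame):
    """Stand-in for the trained TensorFlow model's failure score."""
    return frame["failure_score"]

def decide(frame):
    """Map a frame's failure score to an action."""
    score = classify_frame(frame)
    if score >= FIRE_THRESHOLD:
        return "power_off"      # e.g. toggle the TP-Link Wi-Fi socket
    if score >= FAIL_THRESHOLD:
        return "pause_print"    # e.g. via OctoPrint's job API
    return "keep_printing"

print(decide({"failure_score": 0.10}))  # keep_printing
print(decide({"failure_score": 0.85}))  # pause_print
print(decide({"failure_score": 0.99}))  # power_off
```

The hard part, of course, is the model itself: gathering enough labeled images of good and bad prints that the score is trustworthy.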

Currently, the project exists as a plugin for OctoPrint and relies on two Raspberry Pis – a Zero to handle the camera, and a 3B+ to handle OctoPrint and the TensorFlow software. It’s in an early stage of development and is likely not quite ready to replace human supervision. Still, this is a project that holds a lot of promise, and we’re eager to see further development in this area.

There’s a lot of development happening to improve the reliability of 3D printers – we’ve even seen a trick device for resuming failed prints.


Hackaday Links: March 3, 2019

In this week’s edition of ‘why you should care that Behringer is cloning a bunch of vintage synths’, I present to you this amazing monstrosity. Yes, it’s a vertical video of a synthesizer without any sound. Never change, Reddit. A bit of explanation: this is four Behringer Model Ds (effectively clones of the Moog Minimoog; the Behringer version is called the ‘Boog’) stacked in a wooden case. They are connected to a MIDI keyboard ‘with Arduinos’ that split up the notes to each individual Boog. This is going to sound amazing, it’s one gigantic wall of twelve oscillators, and it only cost $800. This is nuts.

Tuesday is Fastnacht day. Fill your face with fried dough.

The biggest news this week is the release of a ‘folding’ phone. This phone is expensive at about $3000 list, but keep in mind this is a flagship phone, one that defines fashion, and its headline feature will eventually be adopted by lower-cost models. Who knows what they’ll think of next.

It’s a new Project Binky! This time, we’re looking at cutting holes in the oil sump, patching those holes, cutting more holes in an oil sump, patching those holes, wiring up a dashcam, and putting in what is probably the third or fourth radiator so far.

Here’s a Kickstarter for new Nixie tubes. It’s a ZIN18, which I guess means an IN18, a tube with a 40mm tall set of numbers. This is the king of Nixie tubes, and one tube will run you about $100. Nah, you can also get new Nixies here.

The Sipeed K210 is a RISC-V chip with built-in neural networks. Why should you care? Because it’s RISC-V. It’s also pretty fast, reportedly 5 times as fast as the ESP32. This is a 3D rendering test of the K210, with all the relevant code on the Github.

I’m not sure if everyone is aware of this, but here’s the best way to desolder through-hole parts. Heat the solder joint up and whack it against a table. It never fails. Hitting things is the best way to make them do what you want.

Talk To Your ‘Scope, And It Will Obey

An oscilloscope is a device that many of us use, and which we often have to use while our hands are occupied with test probes or other tools. [James Wilson] has solved the problem of how to control his ‘scope no-handed, by connecting it to a Raspberry Pi 3 running the snips.ai voice assistant. This is an interesting piece of software that runs natively upon the device in contrast to the cloud service provided by the likes of Alexa or Google Assistant.

The ‘scope in question is a Keysight 1000-X that can be seen in the video below the break, but looking at the Python code we could imagine the same technique being brought to other instruments, such as the Rigol DS1054Z we looked at controlling via USB a year or two ago. The use of the snips.ai software provides a pointer to how voice-controlled projects in our community might evolve beyond cloud services. Interestingly, though they do not make a big thing of it, their software appears to be open-source.
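The glue between the voice assistant and the instrument boils down to mapping each recognized intent (plus its slots) to a SCPI command string. Here is a hedged sketch of that mapping, not [James]’s actual code; the SCPI strings follow the common `:TIMEBASE`/`:CHANNEL` style used on Keysight scopes, but check your instrument’s programming guide before trusting them:

```python
# Map a recognized voice intent + slot values to a SCPI command string.
# Intent names and slot keys here are our own illustrative choices.

def intent_to_scpi(intent, slots):
    """Translate a voice intent into a SCPI command for the 'scope."""
    if intent == "set_timebase":
        return f":TIMEBASE:SCALE {slots['seconds_per_div']}"
    if intent == "set_channel_scale":
        return f":CHANNEL{slots['channel']}:SCALE {slots['volts_per_div']}"
    if intent == "run":
        return ":RUN"
    if intent == "stop":
        return ":STOP"
    raise ValueError(f"unknown intent: {intent}")

# The resulting string would then be written to the instrument over
# USB or LAN, e.g. with PyVISA: instrument.write(cmd)
cmd = intent_to_scpi("set_channel_scale", {"channel": 1, "volts_per_div": 0.5})
print(cmd)
```

Keeping this mapping in one place makes it easy to swap in a different instrument later: only the command strings change, not the voice-assistant plumbing.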

Oscilloscopes do not have to be remotely controlled by voice alone. It seems to be a common desire to take measurements no-handed — one project we’ve featured in the past did the job with a foot switch.
