Hackaday Links: September 15, 2019

It’s probably one of the first lessons learned by new drivers: if you see a big, red fire truck parked by the side of the road, don’t run into it. Such a lesson appears not to have been in the Tesla Autopilot’s driver education curriculum, though – a Tesla Model S managed to ram into the rear of a fire truck parked at the scene of an accident on a southern California freeway. Crash analysis reveals that the Tesla was on Autopilot and following another vehicle; the driver of the lead vehicle noticed the obstruction and changed lanes. Apparently the Tesla reacted to that by speeding up, but failed to notice the stationary fire truck. One would think that the person driving the car would have stepped in to control the vehicle, but alas. Aside from beating up on Tesla, whose Autopilot feature seems intent on keeping the market for batteries from junked vehicles fully stocked, this just points out how far engineers have to go before self-driving vehicles are as safe as even the worst human drivers.

The tech press is abuzz today with stories about potential union-busting at Kickstarter. Back in March, Kickstarter employees announced their intent to organize under the Office and Professional Employees International Union (OPEIU). On Thursday, two of the union organizers were fired. Clarissa Redwine, who recently hosted a Hack Chat, was one of those released; both she and Taylor Moore are protesting their terminations as an illegal attempt to intimidate Kickstarter employees and keep them from voting for the union. For their part, Kickstarter management says that the two, along with two other employees, were released as a result of documented performance issues during the normal review cycle, and that fourteen pro-union employees were given raises during the same cycle, three of them with promotions. There will no doubt be plenty more news about this to come.

Would you pay $900 for a Nixie clock? We wouldn’t, but if you choose to buy into Millclock’s high-end timepiece, it may soften the blow to think of it as an investment in the future of Nixie tubes. You see, Millclock isn’t just putting together an overpriced clock that uses surplus Russian Nixies – they’re actually making brand new tubes. Techmoan recently reviewed the new clock and learned that the ZIN18 tubes are not coming from Czech Republic-based Dalibor Farný, but rather are being manufactured in-house by Millclock. That’s exciting news for Nixie builders everywhere; while Dalibor’s tubes are high-quality products, it can’t hurt to have a little competition in the market. Nixies as a growth industry in 2019 – who’da thunk it?

We ran across an interesting project on Hackaday.io the other day, one that qualifies as a true hack. How much house can you afford? A simple question, but the answer can be very difficult to arrive at with the certainty needed to sign papers that put you on the hook for the next 30 years. Mike Ferarra and his son decided to answer this question – in a circuit simulator? As it turns out, circuit simulators are great at solving the kinds of non-linear simultaneous equations needed to factor in principal, interest, insurance, taxes, wages, and a host of other inflows and outflows. Current sources represent money in, current sinks money paid out. Whatever is left is what you can afford. Is this how Kirchhoff bought his house?
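
For the curious, the heart of the mortgage piece is just the standard fixed-rate amortization formula. Here’s a minimal Python sketch of that one calculation, done directly rather than in a simulator; the budget, rate, and term below are made-up placeholders, and the actual project folds in many more inflows and outflows (wages, taxes, insurance, and so on) than this toy does.

```python
# Rough sketch: how big a loan fits a given monthly budget?
# All numbers below are made-up placeholders, not figures from the project.

def affordable_principal(monthly_budget, annual_rate, years, taxes_insurance):
    """Largest principal whose monthly payment fits the budget.

    Uses the fixed-rate amortization formula
        payment = P * r / (1 - (1 + r) ** -n)
    solved for P, after setting aside taxes and insurance.
    """
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # number of monthly payments
    payment = monthly_budget - taxes_insurance
    if payment <= 0:
        return 0.0
    return payment * (1 - (1 + r) ** -n) / r

# $2,500/month to spend, 4.5% over 30 years, $450/month for taxes and insurance
print(round(affordable_principal(2500, 0.045, 30, 450)))  # about 405,000
```

The circuit-simulator version does the same bookkeeping with current sources for money in and current sinks for money out, which lets the solver juggle all the flows simultaneously instead of one formula at a time.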

And finally, is your parts inventory a bit of a mystery? Nikhil Dabas decided that rather than trying to remember what he had and risk duplicating orders, he’d build an application to do it for him. Called WhatDidIBuy, it does exactly what you’d think; it scrapes the order history pages of sites like Adafruit, Digi-Key, and Mouser and compiles a list of your orders as CSV files. It’s only semi-automated, leaving the login process to the user, but something like this could save a ton of time. And it’s modular, so adding support for new suppliers is as simple as writing a new scraper. Forgot what you ordered from McMaster, eBay, or even Amazon? Now there’s an app for that.
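
To give a flavor of what “writing a new scraper” might involve, here’s a rough Python sketch of one way such a modular tool could be laid out. This is not the actual WhatDidIBuy code; the class and field names are invented, and a canned stand-in takes the place of the real supplier-specific HTML parsing.

```python
# Sketch of a modular order-history scraper: one small class per supplier,
# all feeding a common CSV writer. Everything here is illustrative only.
import csv

class OrderScraper:
    """Base class: a subclass turns one supplier's order-history pages
    into (date, order_number, part, quantity, price) rows."""
    name = "generic"

    def fetch_orders(self, session):
        raise NotImplementedError

class ExampleSupplierScraper(OrderScraper):
    """Stand-in for a real supplier module (Adafruit, Digi-Key, Mouser, etc.).

    A real implementation would use the logged-in `session` to request the
    supplier's order pages and parse the HTML; this one just returns canned
    rows so the sketch runs."""
    name = "example-supplier"

    def fetch_orders(self, session):
        yield ("2019-09-01", "ORDER-1234", "10k resistor", 100, 2.50)
        yield ("2019-09-01", "ORDER-1234", "ESP32 dev board", 2, 19.90)

def dump_to_csv(scrapers, session, path="orders.csv"):
    """Run every scraper and collect the results into one CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["supplier", "date", "order", "part", "qty", "price"])
        for scraper in scrapers:
            for row in scraper.fetch_orders(session):
                writer.writerow([scraper.name, *row])

if __name__ == "__main__":
    dump_to_csv([ExampleSupplierScraper()], session=None)
```

Supporting another supplier then comes down to one more subclass that knows that site’s order-history layout, with the login step still left to the user.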

28 thoughts on “Hackaday Links: September 15, 2019”

  1. Gadzooks! Clarissa was an amazing host to all of us. She even managed to be surprised when I said I knew someone with her name; she’s a character in a series of books I own.

    Kickstarter needs to be, ah, poked at, for making a really bad decision like that.

  2. “…this just points out how far engineers have to go before self-driving vehicles are as safe as even the worst human drivers”

    Actually, I saw almost this same exact scenario about a year ago. A woman, driving in the right-hand lane of a two-lanes-each-way road, had driven directly into the back of a city maintenance truck that was parked with lots of flashing lights on. It seems she was (too) closely following some other car and she looked down at her cell phone “for just a second” when the car in front of her switched lanes to avoid the truck. She looked up and BANG!

    So, no, the Tesla autopilot is NOT worse than worst drivers – just sometimes equivalent to. :-)

    1. The statement about safety is a bit of a shame. As HaD itself reported, we do not have enough data to make that claim:
      https://hackaday.com/2016/12/05/self-driving-cars-are-not-yet-safe/

      And I think we’ve all seen our own fair share of human driving stupidity. I’ve seen an elderly lady drive a car into stopped bicycles at 15 km/h – just not stopping while going slow and not noticing the bikes in front of her. No real harm was done due to the low speed, but it looked really stupid.

      1. The bar for autonomous vehicles should definitely not be elderly drivers, drunk drivers, distracted drivers etc.

        It’s pretty hard to argue with “you see a big, red fire truck parked by the side of the road, don’t run into it”. Teslas would benefit from a Lidar but it’s not sexy so Elon would not allow it.

        1. Having one example where self driving vehicles don’t do too well doesn’t mean they’re less safe than humans. They’re likely to make different mistakes, but may very well have a better bottom line.

      2. That was almost three years ago, which is like forever in self-driving-car-years. Since then, there have been a number of high profile accidents, including fatalities, involving Teslas on Autopilot. But there have also been a _whole lot_ more miles driven.

        Unfortunately, Tesla is very secretive about their numbers. When the NHTSA asked them for data, they claimed that their safety records were “trade secrets”, and provided them data that were essentially crap and apparently intentionally obscured. And that’s when the gov’t asks.

        But yeah, they look to be roughly, almost, kinda equal with the US average. But this average includes new drivers, drunks, and other classes of people who are significantly less safe behind the wheel than I am. (Dunning-Kruger alert!)

        The problem of distinguishing between a non-moving billboard (no worries) and a non-moving firetruck (emergency braking) still needs to be solved.

    1. – Yeah, because lots of average drivers rear-end parked firetrucks… They’re just lucky there wasn’t a fireman standing behind the truck as well; that might have finally been the end of this autonomous car baloney. The average human driver can’t be the bar for a computer-based system. That’s like saying your PC processor should be able to solve large numerical calculations correctly at least as often as you can to be good enough, as far as I’m concerned.

      There is no room for ‘oh well, humans run over and kill other humans sometimes too, or plow into parked cars, we’re doing good enough…’ with autonomous cars. Humans are unfortunately fallible, but publicly exposed deadly machines should not have this margin of error. If you work with dangerous tools, it’s your choice. These are forced on everyone else in public, who are not signing up to beta-test an autonomous car’s driving skills. It’s horrible enough if a distracted driver runs over a kid in a crosswalk, but if one of these things takes out a kid, or a grandma, a spouse, or anyone for that matter, I’d happily see the company sued out of existence by anyone involved.

      I can see a bit of a use case for this type of thing in long-haul, repetitive-route trucking, with human drivers taking the first/last mile, but no way are we anywhere close to where a software system should be handling a vehicle around pedestrians. For personal use, it is just a luxury/convenience sales gimmick that we really don’t need, as far as I’m concerned. Would I love to be driven to work some days? Sure, that’d be awesome. But I don’t need it, and it’s not worth the consequences. Leave AI systems to backing up a human driver if needed, at least for the foreseeable future. Then it’s the ‘best of both worlds’.

  3. “…this just points out how far engineers have to go before self-driving vehicles are as safe as even the worst human drivers.”

    That’s an awfully loaded statement, and I’d like to see some comparison data or something peer reviewed. I understand that that might not be possible right now, given the sparsity of data available.

    1. I’ve been following Tesla closely for the past couple of years. As you suspect, there aren’t enough Tesla crashes to make a scientific determination right now, but preliminary information suggests that Teslas on autopilot are much safer than human drivers.

      One item of note: I recently read about how Tesla autopilot two years ago had problems recognizing when someone cuts in front of you, so they 1) told their fleet to send back recordings of when this happened, 2) Annotated the videos and data records, 3) Retrained their neural nets, and 4) Upgraded all existing Teslas with an OTA update.

      It’s an example of “fleet learning”. As they find problems in the algorithm, they can learn from their mistakes and make all cars better – something a human can’t do. (In the extreme case the human might die, meaning that they can’t learn from the experience. The Tesla fleet can.)

      Self driving doesn’t have to be perfect, it only has to be better than the average human driver. I suspect Tesla has surpassed this mark already, and is continuously improving.

      The other self-driving entries in the market are nowhere near Tesla’s level of sophistication.

      1. It is also important to make sure that the mistakes they do make are similar to human mistakes. That is, that they are things that a human can reasonably expect and anticipate.

        Anybody who drives builds up a model of what sorts of mistakes people make while driving. So we learn to anticipate those mistakes, and try to allow for them. (e.g., many bicyclists fail to stop at stop signs, so that is an error they make which I can try to allow for.)

        It is much harder to handle when a machine does something no human would do, since we can’t predict how they will respond.

      2. @PWalsh: What numbers you got?

        I periodically try to get anything reliable/useful on Tesla, and continually come up on a smooth wall. There’s nothing except what they want you to hear, and that’s not actually the data that would be useful to determine anything about safety.

        Given the marketing value of stats proving that Teslas are safer than human drivers, you would expect to hear them loudly trumpeted if they were.

        (And Teslas are _not_ self-driving cars. That is the sole reason that they did not face manslaughter charges in the various crashes that ended fatally.)

  4. I used to carry an inventory list of various appliances and power tools in my PDA.
    So, for example, if Sears had a lawn mower attachment or consumable on sale, I had access to the model # and could check if it was the right type of part. I also carried a spreadsheet of ICs.
    Alas, the PDA battery died, and it turned out that those files were not part of the routine backup.

  5. Is there a similar tool for scraping information from grocery store purchases?
    Some grocery stores let you see the list of things you have purchased (if you use their member ID card). Could be interesting to gather that data over time and see what you actually eat, how much it costs, etc. (Why should they get all the fun of analyzing your diet?)
