We’ve gotten to the point where a $35 Raspberry Pi can be a reasonable alternative to a traditional desktop or laptop, and microcontrollers in the Arduino ecosystem are getting powerful enough to handle some remarkably demanding computational jobs. But there’s still one area where microcontrollers seem to be lagging a bit: machine learning. Sure, there are purpose-built edge-computing SBCs, but wouldn’t it be great to be able to run AI models on versatile and ubiquitous MCUs that you can pick up for a couple of bucks?
We’re moving in that direction, and our friends at Adafruit Industries want to stop by the Hack Chat and tell us all about what they’re working on. In addition to Ladyada and PT, we’ll be joined by Meghna Natraj, Daniel Situnayake, and Pete Warden, all from the Google TensorFlow team. If you’ve got any interest in edge computing on small form-factor computers, you won’t want to miss this chat. Join us, ask your questions about TensorFlow Lite and TensorFlow Lite for Microcontrollers, and see what’s possible in machine learning way out on the edge.
Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.
A familiar spirit, or just a familiar, is a creature rumored to help people in the practice of magic. The moniker is perfect for Archimedes, the robot owl built by Alex Glow, which wields the Google AIY Vision kit to react when it detects faces. A series of very interesting design choices is what really gives the creature life. Not all of those choices were on purpose, which is the core of her talk at the 2018 Hackaday Superconference.
You can watch the video of her talk, along with an interview with Alex after the break.
We released this on Tuesday and mentioned that the podcast was available on all major podcasting platforms. But it turns out you need to wait for approval, which we have since received.
This first installment covers some of our favorite trends, articles, and hacks from 2018. Get caught up on what you missed this year, and head over to the show notes for links to everything Mike and Elliot covered in the 67-minute walk down memory lane. We’ll be doing this more regularly in 2019, so pick your podcast aggregator and subscribe to Hackaday to keep up with new episodes.
Join us for the podcast, available on all major podcasting platforms, as Editors Mike Szczys and Elliot Williams attempt the impossible task of distilling the entire year into a one hour discussion. We’ve included every story mentioned in the podcast, and a few more, in the show notes here. But since we can’t possibly mention every awesome hack, we encourage you to share your favorites, and pat the writers on the back, by leaving a comment below.
Kudos and congratulations to all of the Hackaday writers and editors for an incredible year. Not a single day went by where we published fewer than eight articles, and that is a testament to the odd hours and quirky rabbit holes the Hackaday writing crew finds itself in. Equally huge kudos to the thousands of hackers out there who shared their work with us all! You’re all pushing the state of the art forward.
From the banks of levers and steam gauges of 1927’s Metropolis to the multicolored jewels that the crew would knowingly tap on in the original Star Trek, the entertainment industry has always struggled with producing imagery of advanced technology. Whether constrained by budget or imagination, portrayals usually go in one of two directions: they either rely too heavily on contemporary technology, or else they go so far in the opposite direction that it borders on comical.
But it doesn’t always have to be that way. In fact, when technology is shown properly in film it often serves as inspiration for engineers. The portrayal of facial recognition and gesture control in Minority Report was so well done that it’s still referenced today, nearly 20 years after the film’s release. For all its faults, Star Trek is responsible for a number of “life imitating art” creations, such as early mobile phones bearing an unmistakable resemblance to the flip communicators issued to Starfleet personnel.
So when I saw the exceptional use of 3D printing in the Netflix reboot of Lost in Space, I felt it was something that needed to be pointed out. From the way the crew made use of printed parts to the printer’s control interface, everything felt very real. It took existing technology and pushed it forward in a way that was impressive while still being believable. It was the kind of portrayal of technology that modern tech-savvy audiences deserve.
It left such an impression that we decided to reach out to Seth Molson, the artist behind the user interfaces from Lost in Space, and try to gain a little insight from somebody who is fighting the good fight for technology in media: how he creates his interfaces, the pitfalls he navigates, and how the expectations of the viewer have changed now that we all have a touchscreen supercomputer in our pocket.
Wiring is one of those things that we’ve all had to do on a project, but probably didn’t give a lot of thought to. It’s often the last thing that happens during the build, and almost certainly doesn’t get approached with any kind of foresight. You look at the components you need to connect, dig through the parts bins until you find something that looks like it should fit, and tack it in with a blob of solder and perhaps some hot glue if you’re feeling really fancy. We’re all guilty of it from time to time, but Bradley Gawthrop is here to tell you there’s a better way.
If you’re hoping his talk from the 2017 Hackaday Superconference contains “One crazy trick” for turning your normal rat’s nest of wiring into a harness worthy of the Space Shuttle, sorry to disappoint. Bradley acknowledges it takes some extra planning and a couple specialized tools, but the end results speak for themselves. While his talk is a must-watch for anyone looking to master the arcane arts of electron corralling, his post-talk chat with Elliot Williams after the break is a great primer for the how and why of everyone’s least favorite part of building their own hardware.
Bradley will be at Supercon again this year, just one example of the concentration of awesome people you’ll find at the event. We’re now just two weeks away, so go get your ticket and then join us after the break for the interview.
Reverse engineering silicon is a dark art, and when you’re just starting off it’s best to stick to the lesser incantations, curses, and hexes. Hackaday caught up with Ken Shirriff at last year’s Supercon for a chat about the chip decapping and reverse engineering scene. His suggestion is to start with an old friend: the 555 timer.
Ken is well-known for his work photographing the silicon die at the heart of an Integrated Circuit (IC) and mapping out the structures to create a schematic of the circuit. We’re looking forward to Ken’s talk in just a few weeks at the Hackaday Superconference. Get a taste of it in the interview video below.