The door-unlocking mechanism, featuring a 3D printed bevel gear and NEMA 17 stepper.

Hack Swaps Keys For Gang Signs, Everyone Gets In

How many times do you have to forget your keys before you start hacking on the problem? For [Binh], the answer was five times in the last month, and his hack was a gesture-based door unlocker. Which leads to the amusing image of [Binh] standing in a hallway, throwing gang signs until he is let in.

The system itself is fairly simple in its execution: the existing deadbolt is actuated by a NEMA 17 stepper turning a 3D printed bevel gear. It apparently takes 50 steps to lock or unlock, after which the motor powers down, so it's power-efficient and won't burn down [Binh]'s room.
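The video doesn't dwell on firmware, but the lock-and-release routine is simple enough to sketch. A minimal version in MicroPython on the ESP32 might look like this, assuming an A4988-style STEP/DIR/ENABLE driver; the pin numbers and step timing are our guesses, not [Binh]'s.

```python
# MicroPython sketch for the ESP32 side; pins and timing are hypothetical.
from machine import Pin
import time

STEP = Pin(25, Pin.OUT)
DIR = Pin(26, Pin.OUT)
ENABLE = Pin(27, Pin.OUT)  # active-low on most driver boards

def actuate(unlock: bool, steps: int = 50):
    """Run the deadbolt through its 50 steps, then cut power to the coils."""
    ENABLE.value(0)                # energize the driver
    DIR.value(1 if unlock else 0)
    for _ in range(steps):
        STEP.value(1)
        time.sleep_us(500)
        STEP.value(0)
        time.sleep_us(500)
    ENABLE.value(1)                # de-energize so nothing overheats
```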

The software is equally simple: MediaPipe is an ML library that already does finger detection and can be driven from Python. Gesture recognition apparently proved fairly unreliable, so for now [Binh] just has it count the number of fingers flashed. It's running on a Raspberry Pi 5 with a webcam for image input, and the Pi connects via USB serial to an ESP32 that is wired to the stepper driver. [Binh] had another project ready to be taken apart that had the ESP32/stepper combo ready to go, so this was the quickest option. As was mounting everything with double-sided tape, but that also plays into a design constraint: it's not [Binh]'s door.
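For the Pi side, the loop could look something like this sketch, which uses the standard MediaPipe Hands Python bindings along with OpenCV and pyserial. The single-byte serial protocol and the magic finger count are invented for illustration; [Binh]'s actual code is in the GitHub repo linked below.

```python
import cv2
import mediapipe as mp
import serial

UNLOCK_COUNT = 4  # hypothetical "password": how many fingers to flash

esp32 = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

# Tip/PIP landmark pairs for index through pinky (thumb skipped for brevity)
FINGER_TIPS = (8, 12, 16, 20)

def count_fingers(landmarks) -> int:
    """A raised finger has its tip above its PIP joint (smaller y in image coords)."""
    return sum(1 for tip in FINGER_TIPS if landmarks[tip].y < landmarks[tip - 2].y)

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        count = count_fingers(results.multi_hand_landmarks[0].landmark)
        if count == UNLOCK_COUNT:
            esp32.write(b"U")  # one-byte "unlock" command; real code would debounce
```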

[Binh] is staying in a Hacker Hotel, and as you might imagine, there’s been more penetration testing on this than you might get elsewhere. It turns out it’s relatively straightforward to brute force (as you might expect, given it is only counting fingers), so [Binh] is planning on implementing some kind of 2FA. Perhaps a secret knock? Of course he could use his phone, but what’s the fun in that?

Whatever the second factor is, hopefully it’s something that cannot be forgotten in the room. If this project tickles your fancy, it’s open source on GitHub, and you can check it out in action and the build process in the video embedded below.

After offering thanks to [Binh] for the tip, the remaining words of this article will be spent requesting that you, the brilliant and learned hackaday audience, provide us with additional tips.

Continue reading “Hack Swaps Keys For Gang Signs, Everyone Gets In”

DIY 3D Hand Controller Using A Webcam And Scripting

Are you ready to elevate your interactive possibilities without breaking the bank? If so, explore [Caio Bassetti]'s tutorial on creating a full 3D hand controller using only a webcam, MediaPipe Hands, and Three.js. This hack lets you turn a 2D screen into a fully interactive 3D scene, all with your hand movements. If you're passionate about low-cost, accessible tech, try this yourself: little more than a webcam and a browser is needed!

The magic of the project lies in using MediaPipe Hands to track key points on your hand, such as the middle finger and wrist, to calculate depth and positioning. Clever Three.js tricks then map those points onto all three axes of the scene, creating a responsive virtual controller that interprets hand gestures for intuitive movement in 3D space. The hack also implements a closed-fist gesture to grab and drag objects, and detects collisions to add interactivity. It's a simple, practical build, and it performs reliably in most browsers.
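To get a feel for the trick, here is the depth estimate sketched in Python using the standard MediaPipe Hands landmark indices. [Caio]'s version does the equivalent math in JavaScript in the browser, and the calibration constant below is made up.

```python
import math

WRIST, MIDDLE_MCP = 0, 9   # standard MediaPipe Hands landmark indices
CALIBRATION = 0.25         # hypothetical scale: apparent hand size at unit depth

def estimate_pose(landmarks):
    """x/y come straight from the wrist landmark; depth from apparent hand size.
    The wrist-to-middle-knuckle distance shrinks as the hand moves away from
    the camera, so its inverse makes a serviceable depth proxy."""
    w, m = landmarks[WRIST], landmarks[MIDDLE_MCP]
    size = math.hypot(m.x - w.x, m.y - w.y)
    depth = CALIBRATION / max(size, 1e-6)  # avoid division by zero
    return w.x, w.y, depth
```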

For more on this innovation or other exciting DIY hand-tracking projects, browse our archive on gesture control projects, or check out the full article on Codrops. With tools such as MediaPipe and Three.js, turning ideas into reality gets more accessible than ever.

Machine Learning Baby Monitor, Part 2: Learning Sleep Patterns

The first lesson a new parent learns is that the second you think you’ve finally figured out your kid’s patterns — sleeping, eating, pooping, crying endlessly in the middle of the night for no apparent reason, whatever — the kid will change it. It’s the Uncertainty Principle of kids — the mere act of observing the pattern changes it, and you’re back at square one.

As immutable as this rule seems, [Caleb Olson] is convinced he can work around it with this over-engineered sleep pattern tracker. You may recall [Caleb]'s earlier attempts to automate certain aspects of parenthood, like this machine learning system to predict when baby is hungry, and yes, he's also strangely obsessed with automating his dog's bathroom habits. All that preliminary work put [Caleb] in a good position to analyze his son's sleep patterns, which he did with the feed from their baby monitor camera and Google's MediaPipe library.

This lets him measure how far the baby's eyes are open, calculate a wakefulness probability, and record the time he wakes up. This worked great right up until the wave function collapsed and the baby suddenly started sleeping on his side, requiring the addition of a general motion detection function to compensate for the missing eyeball data. Check out the video below for more details, although the less said about the screaming, demon-possessed owl, the better.
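[Caleb]'s exact math isn't in the write-up, but an eye-openness score along these lines is one plausible reading, using the standard MediaPipe Face Mesh right-eye landmarks; the thresholds here are our guesses, not his.

```python
import math

# Standard Face Mesh indices for the right eye: lid top/bottom and the two corners
R_TOP, R_BOTTOM, R_LEFT, R_RIGHT = 159, 145, 33, 133

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def wakefulness(landmarks) -> float:
    """Eye aspect ratio (lid gap over eye width), squashed into [0, 1]."""
    gap = dist(landmarks[R_TOP], landmarks[R_BOTTOM])
    width = dist(landmarks[R_LEFT], landmarks[R_RIGHT])
    ear = gap / width
    # Hypothetical mapping: ~0.05 is a closed eye, ~0.30 is wide open
    return min(max((ear - 0.05) / 0.25, 0.0), 1.0)
```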

The data [Caleb] has collected has helped him and his wife understand the little fellow's sleep needs and fine-tune his cycles. There's a web app, of course, and a really nice graphical representation of total time asleep and awake. No word on naps not taken in view of the camera, though — naps in the car are an absolute godsend for many parents. We suppose those could be logged manually, but we wouldn't be surprised if [Caleb] has a plan to cover that too.

Continue reading “Machine Learning Baby Monitor, Part 2: Learning Sleep Patterns”

Machine Learning Baby Monitor Prevents The Hunger Games

Newborn babies can be tricky to figure out, especially for first-time parents. Despite the abundance of unsolicited advice proffered by anyone who ever had a baby before — and many who haven’t — most new parents quickly get in sync with the baby’s often ambiguous signals. But [Caleb] took his observations of his newborn a step further and built a machine-learning hungry baby early warning system that’s pretty slick.

Normally, babies are pretty unsubtle about being hungry, and loudly announce their needs to the world. But it turns out that crying is a lagging indicator of hunger, and that there are a host of face, head, and hand cues that precede the wailing. [Caleb] based his system on Google's MediaPipe library, using his baby monitor's camera to track such behaviors as lip smacking, pacifier rejection, fist mouthing, and rooting, all signs that someone's tummy needs filling. By putting together a system to recognize these cues and assign a weight to them, [Caleb] now gets a text before the baby gets to the screaming phase, to the benefit not only of the little nipper but of his sleep-deprived servants as well. The video below has some priceless bits in it; don't miss [Baby Caleb] at 5:11 or the hilarious automatic feeder gag at the end.
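The weighting scheme is easy to picture. Here's a toy version in Python: the cue names come from the article, but the weights, threshold, and alert hook are all invented for illustration.

```python
# Cue names from the article; weights and threshold are made up.
CUE_WEIGHTS = {
    "lip_smacking": 0.3,
    "pacifier_rejection": 0.4,
    "fist_mouthing": 0.2,
    "rooting": 0.5,
}
ALERT_THRESHOLD = 0.7  # hypothetical trip point

def hunger_score(active_cues: set[str]) -> float:
    """Sum the weights of whichever cues the vision model flagged this frame."""
    return sum(CUE_WEIGHTS.get(cue, 0.0) for cue in active_cues)

def maybe_alert(active_cues: set[str], send_text) -> None:
    """Text the parents before the screaming phase if enough cues stack up."""
    if hunger_score(active_cues) >= ALERT_THRESHOLD:
        send_text("Baby is probably hungry!")
```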

We’ve seen some interesting videos from [Caleb] recently, mostly having to do with his dog’s bathroom habits and getting help cleaning up afterward. We can only guess how those projects will be leveraged when this kid gets a little older and starts potty training.

Continue reading “Machine Learning Baby Monitor Prevents The Hunger Games”