Laundry Monitor Won’t Generate Static With Roommates

Laundry. It’s one of life’s inescapable cycles, but at least we have machines now. The downside of this innovation is that since we no longer monitor every step — the rock-beating, the river-rinsing, the line-hanging and -retrieving — the pain of laundry has evolved into the monotony of monitoring the robots’ work.

[Adam] shares his wash-bots with roommates, and they aren’t close enough to combine their lights and darks and turn it into a group activity. They needed an easy way to tell when the machines are done running, and whose stuff is even in there in the first place, so [Adam] built a laundry machine monitor that uses current sensing to detect when the machines are done running and sends a text to the appropriate person.

Each machine has a little Hall-effect sensing module that’s carefully zip-tied around its power cable. The signal from these three-wire boards goes high when the machine is running and low when it’s not. At the beginning of the load, the launderer simply presses their assigned button on the control box, and the ESP32 inside takes care of the rest.
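
For a sense of how little firmware this takes, here’s a minimal sketch of the ESP32 side: watch the Hall module’s output, wait for it to stay quiet, then ping whoever pressed their button. The pin numbers, idle timeout, and notifyUser() stub are our own placeholders, not [Adam]’s actual code.

```cpp
// Minimal ESP32 sketch: watch one machine's Hall-effect output and
// text whoever pressed their button when the cycle finishes.
// Pin numbers and notifyUser() are illustrative placeholders.
#include <Arduino.h>

const int SENSE_PIN  = 34;              // Hall module output: HIGH while machine runs
const int BUTTON_PIN = 25;              // this user's "that's my load" button
const unsigned long QUIET_MS = 120000;  // machine must stay idle this long to count as done

int  activeUser = -1;                   // -1 means nobody has claimed the current load
bool wasRunning = false;
unsigned long lastRunningMs = 0;

void notifyUser(int user) {
  // Placeholder: in the real build this would hit an SMS or push API.
  Serial.printf("Load done, notify user %d\n", user);
}

void setup() {
  Serial.begin(115200);
  pinMode(SENSE_PIN, INPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {
    activeUser = 0;                     // button 0 pressed; more buttons -> more users
  }

  bool running = digitalRead(SENSE_PIN) == HIGH;
  if (running) {
    wasRunning = true;
    lastRunningMs = millis();
  } else if (wasRunning && millis() - lastRunningMs > QUIET_MS && activeUser >= 0) {
    notifyUser(activeUser);             // cycle ended and stayed quiet: send the text
    wasRunning = false;
    activeUser = -1;
  }
  delay(50);
}
```

The quiet window is there because washers routinely pause mid-cycle; without it you’d get texted halfway through the rinse.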

Getting a text when your drawers are clean is about as private as it gets. Clean underwear, don’t care? Put it on a scrolling marquee.

Piston-Powered Pellet Pusher For Peckish Pets

We all have our new and interesting challenges in lockdown life. If you’ve had to relocate to ride it out, the chances are good that even your challenges have challenges. Lockdown left [Kanoah]’s sister in the lurch when it came to feeding her recently-adopted pet rat, so he came up with a temporary solution to ensure that the rat never misses a meal.

Most of the automated pet feeders we see around here use an auger to move the food. That’s all fine and good, but if you just need to move a singular mass, the screw seems like overkill. [Kanoah]’s feeder is more akin to a pellet-pushing piston. It runs on a Metro Mini, but an Arduino Nano or anything with enough I/O pins would work just fine. The microcontroller starts counting the hours as soon as it has power, and delivers pellets four times a day with a servo-driven piston arm. [Kanoah] has all the files up on Thingiverse if you need a similar solution.
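
The timing logic really can be that bare-bones: count milliseconds since power-up and sweep the piston every six hours. Something like the sketch below would do it, though the pin and servo angles here are illustrative guesses rather than [Kanoah]’s actual values.

```cpp
// Feed four times a day by counting time since power-up and
// sweeping a servo-driven piston arm. Pin and angles are illustrative.
#include <Arduino.h>
#include <Servo.h>

const int SERVO_PIN = 9;
const unsigned long FEED_INTERVAL_MS = 6UL * 60UL * 60UL * 1000UL;  // every 6 hours

Servo piston;
unsigned long lastFeedMs = 0;

void dispense() {
  piston.write(120);   // push a pellet out of the chute
  delay(800);
  piston.write(20);    // retract so the next pellet can drop in
}

void setup() {
  piston.attach(SERVO_PIN);
  piston.write(20);
  dispense();          // first meal right away, then one every interval
  lastFeedMs = millis();
}

void loop() {
  if (millis() - lastFeedMs >= FEED_INTERVAL_MS) {
    dispense();
    lastFeedMs = millis();
  }
}
```

Because the elapsed-time check uses unsigned subtraction, the millis() rollover at around 49 days doesn’t break the schedule.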

There are many ways of solving the problem of dry pet food delivery. Wet food is a completely different animal, but as it turns out, not impossible to automate.

Foamboard Makes For A Light Hovercraft

If we are to believe many science fiction movies, one day throngs of people wearing skin-tight silver spandex jumpsuits will be riding around on hovercraft. Hovercraft haven’t really taken the world by storm, but [Fitim-Halimi] built his own model version and shows you how he did it. You can see the little craft moving in the video below.

In theory, a hovercraft is pretty simple, but in practice it’s not as easy as it looks. For one thing, you need a lot of air to fill the plenum chamber to get lift. That’s usually a noisy operation. The solution? In this case, a hairdryer gave up its motor for the cause. In addition, once floating on a near-frictionless cushion of air, you have to actually move without contacting the ground. Like many real hovercraft, this design uses another fan to push it along. You can see in the video that the designer uses Jedi hand motions to control the vehicle.


Alexa, Shoot Me Some Chocolate

[Harrison] has been busy finding the sweeter side of quarantine by building a voice-controlled, face-tracking M&M launcher. Not only does this carefully-designed candy launcher have control over the angle, direction, and velocity of its ammunition, it also locates and locks on to targets by itself.

Here comes the science: [Harrison] tricked Alexa into thinking the Raspberry Pi inside the machine is a smart TV named [Chocolate]. He just tells an Echo to increase the volume by however many candy-colored projectiles he wants launched at his face. Simply knowing the secret language isn’t enough, though. Thanks to a little face-based security, you pretty much have to be [Harrison] or his doppelgänger to get any candy.

The Pi takes a picture, looks for faces, and rotates the turret base in that direction using three servos driven by Arduino Nanos. Then the Pi does facial landmark detection to find the target’s mouth hole before calculating the perfect parabola and firing. As [Harrison] notes in the excellent build video below, this machine uses a flywheel driven by a DC motor instead of being spring-loaded. M&Ms travel a short distance from the chute and hit a flexible, spinning disc that flings them like a pitching machine.
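
If you’re curious how a Pi-to-Nano handoff like this might be wired up, a sketch along these lines would handle the Arduino end: the Pi does the vision and ballistics, then sends angles and a fire flag over serial. The line format and the pan/tilt simplification are ours, invented for illustration, not [Harrison]’s actual protocol.

```cpp
// Illustrative Nano-side turret control: the Pi sends lines like
// "P90 T45 F1\n" (pan angle, tilt angle, fire flag); this sketch obeys them.
#include <Arduino.h>
#include <Servo.h>

Servo pan, tilt;
const int FLYWHEEL_PIN = 5;   // DC flywheel motor through a transistor/driver

void setup() {
  Serial.begin(115200);
  pan.attach(9);
  tilt.attach(10);
  pinMode(FLYWHEEL_PIN, OUTPUT);
}

void loop() {
  if (!Serial.available()) return;
  String line = Serial.readStringUntil('\n');

  int p = line.indexOf('P'), t = line.indexOf('T'), f = line.indexOf('F');
  if (p < 0 || t < 0 || f < 0) return;        // ignore malformed lines

  pan.write(constrain(line.substring(p + 1).toInt(), 0, 180));
  tilt.write(constrain(line.substring(t + 1).toInt(), 0, 180));

  if (line.substring(f + 1).toInt() == 1) {
    analogWrite(FLYWHEEL_PIN, 200);           // spin up the pitching disc
    delay(600);                               // long enough for an M&M to hit it
    analogWrite(FLYWHEEL_PIN, 0);
  }
}
```

A line-based ASCII protocol like this is easy to debug from the Pi with nothing more than a serial terminal.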

We would understand if you didn’t want your face involved in a build with Alexa. It’s okay — you can still have a voice-controlled candy cannon.


Nightmare Fuel Telepresence ‘Bot May Become Your Last Friend

After this pandemic thing is all said and done, historians will look back on this period from many different perspectives. The one we’re most interested in of course will concern the creativity that flourished in the petri dish of anxiety, stress, and boredom that have come as unwanted side dishes to stay-at-home orders.

[Hunter Irving] and his brother were really missing their friends, so they held a very exclusive hackathon and built a terrifying telepresence robot that looks like a mash-up of Wilson from Cast Away and that swirly-cheeked tricycle-riding thing from the Saw movies. Oh, and to make things even worse, it’s made of glow-in-the-dark PLA.

Now when they video chat with friends, TELEBOT is there to make it feel as though that person is in the room with them. The Arduino Uno behind its servo-manipulated vintage doll eyes uses the friend’s voice input to control the wind-up teeth based on their volume levels. As you might imagine, their friends had some uncanny valley issues with TELEBOT, so they printed a set of tiny hats that actually do kind of make it all better. Check out the build/demo video after the break if you think you can handle it.
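
The volume-to-jaw trick takes surprisingly little code. Assuming an analog audio-level signal on A0 and a jaw servo on pin 9 (the actual TELEBOT wiring may well differ), the whole thing boils down to something like this:

```cpp
// Map incoming voice volume to how wide the wind-up teeth open.
// Assumes an analog envelope/level signal on A0 and a jaw servo on pin 9.
#include <Arduino.h>
#include <Servo.h>

Servo jaw;
const int LEVEL_PIN  = A0;
const int JAW_CLOSED = 10;   // servo angles, tuned to the mechanism
const int JAW_OPEN   = 70;

void setup() {
  jaw.attach(9);
  jaw.write(JAW_CLOSED);
}

void loop() {
  // Take the loudest sample over a short window so the jaw tracks syllables.
  int peak = 0;
  unsigned long start = millis();
  while (millis() - start < 50) {
    int level = analogRead(LEVEL_PIN);
    if (level > peak) peak = level;
  }
  int angle = map(peak, 0, 1023, JAW_CLOSED, JAW_OPEN);
  jaw.write(constrain(angle, JAW_CLOSED, JAW_OPEN));
}
```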

Not creepy enough for you? Try building your own eyes from the ground up.


Touch-Typing On Fingertips? Prototype Says It Could Work

The fingertips are covered in touch sensors, each intended to be tapped by the thumbtip of the same hand.

Touch-typing with thumbs on a mobile phone keyboard is a pretty familiar way to input text, and that is part of what led to BiTipText, a method of allowing bimanual text input using fingertips. The idea is to treat the first segments of the index fingers as halves of a tiny keyboard, whose small imaginary keys are tapped with the thumbs. The prototype shown here was created to see how well the concept could work.

The prototype hardware uses touch sensors that can detect tap position with a high degree of accuracy, but the software side is where the real magic happens. Instead of hardcoding a QWERTY layout and training people to use it, the team instead ran tests to understand users’ natural expectations of which keys should be on which finger, and how exactly they should be laid out. This data led to an optimized layout, and when combined with predictive features, test participants could achieve an average text entry speed of 23.4 words per minute.
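
To get a feel for what “predictive features” buy you on such a tiny, imprecise keyboard, here’s a toy decoder (emphatically not the paper’s actual model) that scores dictionary words by how well their imaginary key centers explain a sequence of noisy one-dimensional taps:

```cpp
// Toy illustration of statistical decoding for an imprecise fingertip keyboard:
// each tap is a noisy 1D position along the finger; words are scored by how
// well their key centers explain the taps. Layout and dictionary are made up.
#include <cmath>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Imaginary key centers along the index finger, 0.0 (tip) to 1.0 (knuckle).
const std::map<char, double> kKeyCenter = {
    {'c', 0.1}, {'a', 0.3}, {'t', 0.5}, {'o', 0.7}, {'r', 0.9}};

double logLikelihood(const std::string& word, const std::vector<double>& taps) {
  if (word.size() != taps.size()) return -1e9;
  const double sigma = 0.15;                // spatial noise of a thumb tap
  double ll = 0.0;
  for (size_t i = 0; i < word.size(); ++i) {
    auto it = kKeyCenter.find(word[i]);
    if (it == kKeyCenter.end()) return -1e9;
    double d = taps[i] - it->second;
    ll += -(d * d) / (2 * sigma * sigma);   // Gaussian log-likelihood (up to a constant)
  }
  return ll;
}

int main() {
  std::vector<std::string> dictionary = {"cat", "car", "oat", "rat"};
  std::vector<double> taps = {0.15, 0.35, 0.55};  // noisy taps meant to spell "cat"

  std::string best;
  double bestScore = -1e18;
  for (const auto& w : dictionary) {
    double score = logLikelihood(w, taps);
    if (score > bestScore) { bestScore = score; best = w; }
  }
  std::cout << "decoded: " << best << "\n";       // prints "cat"
  return 0;
}
```

With only a handful of fuzzy key positions per finger, the spatial model alone is ambiguous; it’s the dictionary (and, in the real system, the optimized layout) that does the heavy lifting.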

Judging by the prototype hardware, it’s understandable if one thinks the idea of fingertip keyboards may be a bit ahead of its time. But considering the increasingly “always on, always with you” nature of personal technology, the goal of the project was more about investigating ways for users to provide input in fast and subtle ways. It seems that the idea has some merit in principle. The project’s paper can be viewed online, and the video demonstration is embedded below.


Open Laser Blaster Shells Out More Bang For The Buck

[a-RN-au-D] was looking for something fun to do with his son and dreamed up a laser blaster game that ought to put him in the running for father of the year. It was originally just going to be made of cardboard, but you know how these things go. We’re happy the design went this far, because that blaster looks fantastic.

Both the blaster and the target run on Arduino Nanos. There’s a 5mW laser module in the blaster, and a speaker for playing the pew pew-related sounds of your choice. Fire away on the blaster button, and the laser hits a light-dependent resistor mounted in the middle of the target. When the target registers a hit, it swings backward on a 9g servo and then returns quickly to vertical for the next shot.
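
On the target end, the whole detection problem reduces to “watch the LDR, swing the servo when the reading jumps.” A sketch along these lines would do it; the threshold, pins, and timings here are illustrative rather than lifted from [a-RN-au-D]’s code.

```cpp
// Target side: detect a laser hit on the LDR and knock the target back,
// then return it upright for the next shot. Values are illustrative.
#include <Arduino.h>
#include <Servo.h>

const int LDR_PIN   = A0;     // light-dependent resistor in a voltage divider
const int HIT_DELTA = 200;    // how much brighter than ambient counts as a hit

Servo swing;
int ambient = 0;

void setup() {
  swing.attach(9);
  swing.write(90);                  // upright
  delay(500);
  ambient = analogRead(LDR_PIN);    // sample the room light at power-up
}

void loop() {
  if (analogRead(LDR_PIN) > ambient + HIT_DELTA) {
    swing.write(0);                 // swing the target backward
    delay(600);
    swing.write(90);                // pop back up, ready for the next shot
    delay(300);
  }
}
```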

There are some less obvious features that really make this game a hit. The blaster can run in 10-shooter mode (or 6, or whatever you change it to in the code) with a built-in reload delay, or it can be set to fully automatic. If you’re short on space or just get sick of moving the target to different flat surfaces, it can be mounted on the wall instead — the target moves forward when hit and then resets back to flat. Check out the demo video we loaded up after the break.
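
The shot counter and reload delay are just as simple to sketch out. Here’s a semi-auto-only take on the idea, with magazine size, pins, and timings invented for illustration:

```cpp
// Blaster side: fire the laser on each trigger pull, count shots,
// and force a reload pause once the magazine runs dry. Semi-auto only.
#include <Arduino.h>

const int TRIGGER_PIN = 2;    // pushbutton to ground
const int LASER_PIN   = 3;    // 5 mW laser module through a transistor
const int MAG_SIZE    = 10;   // "10-shooter mode"
const unsigned long RELOAD_MS = 3000;

int shotsLeft = MAG_SIZE;

void fire() {
  digitalWrite(LASER_PIN, HIGH);   // a short pulse is enough to trip the LDR
  // tone(SPEAKER_PIN, 880, 80);   // pew sound, if a speaker is attached
  delay(80);
  digitalWrite(LASER_PIN, LOW);
}

void setup() {
  pinMode(TRIGGER_PIN, INPUT_PULLUP);
  pinMode(LASER_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(TRIGGER_PIN) == LOW) {
    if (shotsLeft > 0) {
      fire();
      shotsLeft--;
    } else {
      delay(RELOAD_MS);            // reload pause, then a fresh magazine
      shotsLeft = MAG_SIZE;
    }
    while (digitalRead(TRIGGER_PIN) == LOW) { }  // wait for trigger release
    delay(30);                                   // debounce
  }
}
```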

No printer? No problem — here’s a Node-RED shooting gallery that uses simple wooden targets.
