This Negative Reinforcement Keyboard May Shock You

We wouldn’t be where we are today without Mrs. Coldiron’s middle school typing class. Even though she may have wanted to, she never did use negative reinforcement to improve our typing speed or technique. We unruly teenagers might have learned to type a lot faster if those IBM Selectrics had been wired up for discipline like [3DPrintedLife]’s terrifying, tingle-inducing typist trainer keyboard (YouTube, embedded below).

This keyboard uses capsense modules and a neural network to detect whether the user is touch-typing or just hunting and pecking. If you’re doing it wrong, you’ll get a shock from the guts of a prank shock pen every time you peck the T or Y keys. Oh, and just for fun, there’s a 20 V LED bar across the top that strobes randomized, blindingly bright light to deter you from looking down at your hands.

Twenty-four of the keys are connected in groups of three by finger usage — for example Q, A, and Z are wired to the same capsense module. These are all wired up to a Raspberry Pi Zero along with the light bar. [3DPrintedLife] was getting a lot of cross-talk between capsense modules, so they solved the problem in software by training a TensorFlow model with a ton of both proper and improper typing data.
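As a rough illustration of the idea, here is a minimal sketch of a binary Keras classifier over windows of capsense readings. The channel count, window length, and file names are our assumptions for illustration, not details from [3DPrintedLife]’s build.

```python
# Minimal sketch: classify touch-typing vs. hunting-and-pecking from
# capsense readings. The eight-channel layout and window size are
# assumptions, not the project's actual parameters.
import numpy as np
import tensorflow as tf

N_CHANNELS = 8      # one capsense module per finger group (assumed)
WINDOW = 20         # samples per classification window (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_CHANNELS)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = proper touch-typing
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# X: recorded capsense windows, y: labels from proper/improper typing sessions
X = np.load("typing_windows.npy")   # shape (n_samples, WINDOW, N_CHANNELS)
y = np.load("typing_labels.npy")
model.fit(X, y, epochs=20, validation_split=0.2)

# At run time, anything under the threshold would trigger the shock driver.
if model.predict(X[:1])[0, 0] < 0.5:
    print("ZAP!")  # stand-in for driving the shock circuit GPIO
```

Training on whole windows rather than single readings is what lets the model shrug off the cross-talk: the smeared signature of a pecked key looks different over time than a clean per-finger press.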

We love the little meter on the touchscreen that shows at a glance how you’re doing in the touch typing department. As the meter inches leftward, you know you’re in for a shock. [3DPrintedLife] even built in some games that use pain to promote faster and more accurate typing. Check out the build video after the break, but don’t say we didn’t warn you about the strobing lights.

The secret to the shock pen is a tiny flyback transformer like the kind used in CRT televisions. Find a full-sized flyback transformer and you can build yourself a handheld high-voltage power supply.

Continue reading “This Negative Reinforcement Keyboard May Shock You”

Reachy The Open Source Robot Says Bonjour

Humanoid robots always attract attention, but anyone who tries to build one quickly learns respect for a form factor we take for granted because we were born with it. Pollen Robotics wants to help move the field forward with Reachy: a robot platform available both as a product and as a wealth of information shared online.

This French team has released open source robots before. We’ve looked at their Poppy robot and can see a strong family resemblance in Reachy. Poppy was a very ambitious design with both arms and legs, but it could only ever walk with assistance. In contrast, Reachy focuses on just the upper body. One of the most interesting innovations is found in Reachy’s neck, a cleverly designed 3-DOF mechanism they call Orbita. Combined with two moving antennae at the top of the head, Reachy can emote a wide range of expressions despite not having much of a face. The remainder of Reachy’s joints are articulated with Dynamixel serial bus servos, though we see an optional Orbita-based hand attachment in the demo video (embedded below).
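For the curious, Dynamixel servos speak a packet protocol over a shared serial bus, and ROBOTIS publishes a Python SDK for it. Below is a minimal sketch of commanding one servo with dynamixel_sdk; the control-table addresses assume a Protocol 2.0 X-series servo, and the port and servo ID are placeholders, since Reachy’s own software wraps all of this for you.

```python
# Minimal sketch of moving one Dynamixel bus servo with ROBOTIS's
# dynamixel_sdk. Control-table addresses are for Protocol 2.0 X-series
# servos; Reachy's exact models, IDs, and port are assumptions here.
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_TORQUE_ENABLE = 64
ADDR_GOAL_POSITION = 116
DXL_ID = 1                       # assumed servo ID on the bus

port = PortHandler("/dev/ttyUSB0")
packet = PacketHandler(2.0)      # Protocol 2.0

port.openPort()
port.setBaudRate(1_000_000)

packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)     # torque on
packet.write4ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 2048)  # mid-travel
port.closePort()
```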

Reachy’s €19,990 price tag may be affordable relative to industrial robots, but it’s pretty steep for the home hacker. No need to fret; those of us with smaller bank accounts can still join the fun, because Pollen Robotics has open-sourced a lot of Reachy details. Digging into this information, we see Reachy has a Google Coral for accelerating TensorFlow and a Raspberry Pi 4 for general computation. Mechanical designs are released via web-based Onshape CAD. Reachy’s software suite on GitHub is primarily focused on Python, which lets us experiment from within a Jupyter notebook. Simulation can be done in the Unity 3D game engine, which can optionally be compiled to run in a browser like the simulation playground. Academic robotics researchers are not excluded from the fun either: ROS1 integration is available, though ROS2 support is still on the to-do list.
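As a taste of what the ROS1 integration might look like from Python, here is a hypothetical rospy sketch. The topic name and message type are our guesses for illustration only; check Pollen Robotics’ ROS package for the real interface.

```python
# Hypothetical sketch of streaming a command to one Reachy joint through
# the ROS1 integration with rospy. The topic name and message type are
# assumptions for illustration, not the package's documented interface.
import rospy
from std_msgs.msg import Float64

rospy.init_node("reachy_wave")
pub = rospy.Publisher("/reachy/right_arm/shoulder_pitch/command",
                      Float64, queue_size=1)

rate = rospy.Rate(10)  # 10 Hz command stream
while not rospy.is_shutdown():
    pub.publish(Float64(data=0.5))  # radians, assumed convention
    rate.sleep()
```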

Reachy might not be as sophisticated as some humanoid designs we’ve seen, and without a lower body there’s no way for it to dance. But we are very appreciative of a company willing to share knowledge with the world. May it spark new ideas for the future.

[via Engadget]

Continue reading “Reachy The Open Source Robot Says Bonjour”

Into The Belly Of The Beast With Placemon

No, no, at first we thought it was a Pokemon too, but Placemon monitors your place, your home, your domicile. Instead of a purpose-built device, like a CO detector or a burglar alarm, this is a generalized monitor that streams data to a central processor where machine learning algorithms notify you if something is awry. In a way, it is like a guard dog who texts you if your place is unusually cold, on fire, unlawfully occupied, or underwater.

[anfractuosity] is trying to make a hacker-friendly version, inspired by a scientific paper on general-purpose sensing, that trades some accuracy for less expensive components. For example, the paper suggests thermopile arrays, like low-resolution heat-vision, but Placemon will have a plain thermometer, which seems like a prudent starting place.

The PCB is ready to start collecting sound, temperature, humidity, barometric pressure, illumination, and passive IR, then report that telemetry over Wi-Fi via an onboard ESP32. A box running TensorFlow receives the data from any number of locations and is being trained to recognize the sensor signatures of a few everyday household events. Training starts with events that are easy to repeat, like kitchen sounds and appliance operations. From there, [anfractuosity] hopes to be versed enough to teach it new sounds, so if a pet gets added to the mix, it doesn’t assume there is an avalanche every time Fluffy needs to go to the bathroom.
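The reporting side could be as simple as this MicroPython sketch, which is our own rough take rather than Placemon’s firmware; the Wi-Fi credentials, pins, endpoint, and payload fields are all placeholders.

```python
# MicroPython sketch of the reporting side: bundle sensor readings into
# JSON and POST them over Wi-Fi. The endpoint, pins, and payload fields
# are assumptions; Placemon's actual firmware may differ.
import network
import time
import urequests                      # bundled with many ESP32 builds
from machine import ADC, Pin

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("home-ssid", "password")
while not wlan.isconnected():
    time.sleep(0.5)

light = ADC(Pin(34))                  # photodiode on GPIO34 (assumed)
pir = Pin(27, Pin.IN)                 # PIR output on GPIO27 (assumed)

while True:
    payload = {
        "light": light.read(),
        "motion": pir.value(),
        # temperature, humidity, pressure, and sound read similarly
    }
    urequests.post("http://192.168.1.10:8000/telemetry",
                   json=payload).close()
    time.sleep(5)
```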

We have another outstanding example of sensing household events without directly interfacing with an appliance, and bringing a sensor suite to your car might be up your alley.

OPARP Telepresence Robot

[Erik Knutsson] is stuck inside with a bunch of robot parts, and we know what lies down that path. His Open Personal Assistant Robotic Platform aims to help out around the house with things like filling pet food bowls, but for now, he is taking one step at a time and working out the bugs before adding new features. Wise.

The build started with a narrow base, an underpowered RasPi, and a quiet speaker, but each of those was upgraded in turn. Right now, it is a personal assistant on wheels. Alexa was the first contender, but Mycroft is in the spotlight because it is more versatile. At first, mobility control was a humble web server with a D-pad; now it leverages a distance sensor and vision, and the robot can even follow you on voice command.
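That early web-server-with-a-D-pad stage might have looked something like this minimal Flask sketch using gpiozero; the routes and GPIO pins are our assumptions, not [Erik]’s actual wiring.

```python
# Minimal sketch of the "web server with a D-pad" stage: Flask endpoints
# drive two motors through gpiozero. Pin numbers are placeholders.
from flask import Flask
from gpiozero import Robot

app = Flask(__name__)
robot = Robot(left=(4, 14), right=(17, 18))  # (forward, backward) GPIO pairs

@app.route("/move/<direction>")
def move(direction):
    action = {"up": robot.forward, "down": robot.backward,
              "left": robot.left, "right": robot.right,
              "stop": robot.stop}.get(direction)
    if action is None:
        return "unknown direction", 404
    action()
    return "ok"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A web page with five buttons hitting those routes is all the D-pad you need before graduating to sensors and vision.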

The screen up top gives it a personable look, but it is slated to become a display for everything you’d want to see on your robot assistant, like weather, recipes, or a video chat that can walk around with you. [Erik] would like to make something that assists the elderly with chores and helps connect people who, like him, are stuck inside.

Expressive robots have long captured our attention, and we’re nuts for privacy-centric personal assistants.

Continue reading “OPARP Telepresence Robot”

Generate Positivity With Machine Learning

Gesture recognition and machine learning are getting a lot of air time these days, as people understand them better and develop ways to implement them on many different platforms. Of course, this opens the new tools to people outside strictly academic or business environments. For example: rollerblading down the streets of Atlanta with a gesture-recognizing, streaming TV that [nate.damen] wears over his head.

He’s known as [atltvhead] and the TV he wears has a functional LED screen on the front. The whole setup reminds us a little of Deep Thought. The screen can display various animations, which are controlled through Twitch chat as he streams his journeys around town. He wanted to add a little more interaction to the animations and simplify his user interface, so he set up a gesture-sensing sleeve that augments the animations based on how he’s moving his arm. He uses an Arduino in the arm sensor and a Raspberry Pi in the backpack to tie it all together, and he goes deep into the weeds explaining how to use TensorFlow to recognize the gestures. The video linked below also shows a lot of his training runs for the machine learning system he used.
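In spirit, the gesture recognizer boils down to classifying fixed-length accelerometer windows. Here is a small Keras sketch of that idea; the sample rate, window length, and gesture classes are our placeholders, not [atltvhead]’s actual setup.

```python
# Sketch of the gesture-recognition idea: classify fixed-length
# accelerometer windows from the sleeve's Arduino into gesture classes.
# Window length, axes, and the four class labels are assumptions.
import numpy as np
import tensorflow as tf

WINDOW, AXES, N_GESTURES = 50, 3, 4   # ~1 s at 50 Hz (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, AXES)),
    tf.keras.layers.Conv1D(16, 5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(N_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Windows recorded during training runs, integer labels 0..3
X = np.load("gesture_windows.npy")    # shape (n, WINDOW, AXES)
y = np.load("gesture_labels.npy")
model.fit(X, y, epochs=30, validation_split=0.2)
```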

[nate.damen] didn’t stop at the cheerful TV head, either. He also wears a backpack that displays uplifting messages to people as he passes them on his rollerblades, not wanting to leave out those who don’t get to see him coming. We think this is a great, uplifting project, and the amount of work that went into getting the gesture-recognition machine learning right is impressive on its own. If you’re new to TensorFlow, though, we have featured some projects that can do reliable object recognition using little more than a Raspberry Pi and a camera.

Continue reading “Generate Positivity With Machine Learning”

Autonomous Sentry Gun Packs A Punch And A Ton Of Build Tips

What has dual compressed-air cannons, 500 roll-on deodorant balls, and a machine-learning brain with a bad attitude? We didn’t know either, until [Leo Fernekes] dropped this video on his autonomous robot sentry gun and saw it in action for ourselves.

Now, we’ve seen tons of sentry guns on these pages before, shooting everything from water to various forms of Nerf. And plenty of those builds have used some form of machine vision to aim the gun onto the target. So while it might appear that [Leo]’s plowing old ground here, this build is chock full of interesting tips and tricks.

It started when [Leo] saw a video on TensorFlow basics from our friend [Edje Electronics], which gave him the boost needed to jump into an AI project. The controller he ended up with looks for humans in the scene and slews the turret onto target, where the air cannons can do their thing. The hefty ammo is propelled by compressed air, which is dumped into the chamber by a solenoid valve with an interesting driver that maximizes the speed at which it opens. Style points go to the bacteriophage T4-inspired design, and to the sequence starting at 1:34, which reminded us of the factory scene from RoboCop.
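The aim loop is conceptually simple: detect people, compute how far the best detection sits from the center of the frame, and slew proportionally. Here is a hedged sketch using an off-the-shelf TF Hub detector; the model choice, score threshold, and the set_pan_speed() stand-in are our assumptions, not [Leo]’s code.

```python
# Sketch of the aim loop: find the most confident person detection and
# nudge the pan axis toward it. Model, camera index, and the turret
# driver stand-in are assumptions for illustration.
import cv2
import tensorflow as tf
import tensorflow_hub as hub

detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")
cam = cv2.VideoCapture(0)

def set_pan_speed(v):
    print(f"pan speed {v:+.2f}")      # stand-in for the turret driver

while True:
    ok, frame = cam.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    out = detector(tf.convert_to_tensor(rgb)[tf.newaxis, ...])
    boxes = out["detection_boxes"][0].numpy()
    classes = out["detection_classes"][0].numpy()
    scores = out["detection_scores"][0].numpy()
    people = [b for b, c, s in zip(boxes, classes, scores)
              if c == 1 and s > 0.6]          # COCO class 1 = person
    if people:
        ymin, xmin, ymax, xmax = people[0]    # highest-scoring detection
        error = (xmin + xmax) / 2 - 0.5       # horizontal offset from center
        set_pan_speed(2.0 * error)            # proportional slew
    else:
        set_pan_speed(0.0)
```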

[Leo] really put a ton of work into this project, and the results show. He is hoping to get an art gallery or museum to show it as an interactive piece to comment on one possible robot-human future, presumably after getting guests to sign a release. Whatever happens to it, the robot looks great and [Leo] learned a lot from it, as did we.

Continue reading “Autonomous Sentry Gun Packs A Punch And A Ton Of Build Tips”

The Smallest Cell Phone Picture

Mobile phones are the photography tool for most of us, but they are a blunt instrument. If you love astrophotography, you buy a DSLR and a lens adapter. Infrared photography needs camera surgery or a special unit. If you want to look closer to home, you may have a microscope with a CCD. Your pocket computer is not manufactured for microscopy, but that does not mean it cannot be convinced. Most of us have held a phone’s lens up to the eyepiece of some binoculars or a microscope, and it sort of works, but it is far from perfect. With cellSTORM, [Benedict Diederich] and a team are proving that we can get darn beautiful images with a microscope, a phone holder, and some purpose-built software on an Android phone.

The trick to getting useful images is to compare a series of pictures and figure out which pixels matter and which ones are noisy. Imagine someone shows you grainy nighttime footage from an outdoor security camera. When you pause, it looks like hot garbage, and you can’t tell the difference between a patio chair and a shrubbery. As it plays, the noisy pixels bounce around and you figure out you’re looking at a spruce bush. That is roughly how the software parses out a crisp image. At the cost of frame rate, you get clarity, which is why you need a phone holder. Some of their tests took minutes, so astrophotography might not fare as well.
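The underlying intuition fits in a few lines: pixels that persist across frames are signal, and a per-pixel median keeps them while rejecting the bounce. This stacking sketch is a simplification of that idea, not cellSTORM’s actual reconstruction pipeline, and the file names are placeholders.

```python
# Per-pixel median across a stack of frames: stable pixels survive,
# noisy ones get voted out. A simplified sketch of the stacking idea,
# not cellSTORM's reconstruction pipeline.
import numpy as np
import imageio.v2 as imageio

frames = np.stack([imageio.imread(f"frame_{i:03d}.png")
                   for i in range(100)])
stacked = np.median(frames, axis=0).astype(np.uint8)

imageio.imwrite("stacked.png", stacked)
# Noise falls roughly with the square root of the frame count, which is
# why long captures (and a steady phone holder) pay off.
```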

We love high-resolution pictures of tiny things and that isn’t going to change anytime soon.

Thank you [Dr. Nicolás De Francesco] for the tip.