Secret Messages On Plastic, Just Add Tesla Coil

Here’s a short research paper from 2013 that explains how to create “hydroglyphics”, or writing with selective surface wetting. In it, an ordinary-looking petri dish is treated so that it reveals a hidden message when exposed to water vapor. The contrast between hydrophobic and hydrophilic surfaces, invisible to the naked eye, becomes apparent when the dish is misted with water. All it took was a mask and a little treatment with a modified Tesla coil.

Plastics tend to be hydrophobic, meaning their surface repels water. These plastics also tend to be non-receptive to things like inks and adhesives. However, there is an industrial process called corona treatment (invented by Verner Eisby in 1951) that changes the surface energy of materials like plastics, rendering them more receptive to inks, coatings, and adhesives. Eisby’s company Vetaphone still exists today, and has a page describing the process.

What’s this got to do with the petri dishes and their secret messages? The process is essentially the same. By using a Tesla coil modified with a metal wire mesh, the surface of the petri dish is exposed to the coil’s discharge, altering its surface energy and rendering it hydrophilic. By selectively blocking the discharge with a nonconductive mask made from a foam sticker, the masked area remains hydrophobic. Mist the surface with water, and the design becomes visible.

The effects of corona treatment decay over time, but we think this is exactly the sort of thing that is worth keeping in mind just in case it ever comes in useful. Compact Tesla coils are fairly easy to get a hold of nowadays, but it’s also possible to make your own.

AI Image Generator Twists In Response To MIDI Dials, In Real-Time

MIDI isn’t just about music, as [Johannes Stelzer] shows by using dials to adjust AI-generated imagery in real-time. The results are wild, with an interactivity to them that we don’t normally see in such things.

[Johannes] uses Stable Diffusion’s SDXL Turbo to create a baseline image of “photo of a red brick house, blue sky”. The hardware dials act as manual controls for applying different embeddings to this baseline, such as “coral”, “moss”, “fire”, “ice”, “sand”, “rusty steel” and “cookie”.

By adjusting the dials, those embeddings are applied to the base image in varying strengths. The results are generated on the fly and are pretty neat to see, especially since there is no appreciable amount of processing time required.
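For a rough idea of how such a control loop can be wired together, here is a minimal sketch assuming the diffusers and mido Python libraries rather than [Johannes]’s actual lunar_tools setup. The MIDI CC numbers, the concept prompts, and the crude linear blending of prompt embeddings are all illustrative placeholders.

```python
# A minimal sketch, not [Johannes]'s code: MIDI CC dials set per-concept weights,
# which linearly blend concept prompt embeddings into a base prompt embedding
# before a single-step SDXL Turbo render. CC numbers and prompts are examples.
import mido
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16
).to("cuda")

def embed(text):
    # For SDXL pipelines, encode_prompt() returns (embeds, neg, pooled, neg_pooled)
    pe, _, pooled, _ = pipe.encode_prompt(
        text, device="cuda", num_images_per_prompt=1, do_classifier_free_guidance=False
    )
    return pe, pooled

base = embed("photo of a red brick house, blue sky")
concepts = {20: embed("coral"), 21: embed("moss"), 22: embed("fire")}  # CC -> concept
weights = {cc: 0.0 for cc in concepts}

with mido.open_input() as port:                  # first available MIDI input
    for msg in port:
        if msg.type == "control_change" and msg.control in weights:
            weights[msg.control] = msg.value / 127.0   # dial position, 0..1
        pe, pooled = base[0].clone(), base[1].clone()
        for cc, (cpe, cpooled) in concepts.items():
            pe = torch.lerp(pe, cpe, weights[cc])        # crude linear blend
            pooled = torch.lerp(pooled, cpooled, weights[cc])
        image = pipe(prompt_embeds=pe, pooled_prompt_embeds=pooled,
                     num_inference_steps=1, guidance_scale=0.0).images[0]
        image.save("frame.png")
```

Because SDXL Turbo needs only a single denoising step, regenerating on every dial movement is what makes this kind of setup feel live rather than batch-rendered.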

The MIDI controller is integrated with the help of lunar_tools, a software toolkit on GitHub to facilitate creating interactive exhibits. As for the image end of things, we’ve previously covered how AI image generators work.

Re-imagining Telepresence With Humanoid Robots And VR Headsets

Don’t let the name of the Open-TeleVision project fool you; it’s a framework for improving telepresence and making robotic teleoperation far more intuitive than it otherwise would be. It accomplishes this in part by taking advantage of the remarkable technology packed into modern VR headsets like the Apple Vision Pro and Meta Quest. There are loads of videos on the project page, many of which demonstrate successful teleoperation across vast distances.

Teleoperation of robotic effectors typically takes some getting used to. The camera views are unusual, the limbs don’t move the same way arms do, and intuitive human things like looking around to get a sense of where everything is don’t translate well.

A stereo camera with gimbal streaming to a VR headset complete with head tracking seems like a very hackable design.

To address this, researchers provided the user with a robot-mounted, real-time stereo video stream (through which the user can turn their head and look around normally) and mapped arm and hand movements to their humanoid robotic counterparts. This provides the feedback needed to manipulate objects and perform tasks in a much more intuitive way. In short, when our eyes, bodies, and hands look and work more or less the way we expect, it turns out it’s far easier to perform tasks.

The research paper goes into detail about the different systems, but in essence, a stereo depth and RGB camera is perched on a 3D-printed gimbal atop a humanoid robot frame like the Unitree H1, which is equipped with high-dexterity hands. A VR headset takes care of displaying a real-time stereoscopic video stream and letting the user look around. Hand tracking for the user is mapped to the dexterous hands and fingers. This lets a person look at, manipulate, and handle things without in-depth training. Perhaps slower and more clumsily than they would like, but in an intuitive way all the same.
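The head-tracking half is the easiest part to picture in code. Here is a heavily hedged sketch, not the project’s software: get_headset_orientation() and the serial command format are hypothetical stand-ins for whatever your headset SDK and gimbal firmware actually expose.

```python
# A hedged sketch of the head-tracking idea, not the Open-TeleVision code:
# mirror the headset's yaw/pitch onto a two-servo camera gimbal over serial.
# get_headset_orientation() and the "P<pan>,<tilt>" protocol are hypothetical
# placeholders for whatever your headset SDK and gimbal firmware provide.
import time

import serial  # pyserial

def get_headset_orientation():
    """Hypothetical: return (yaw, pitch) of the headset in radians."""
    raise NotImplementedError("replace with your headset SDK call")

def angle_to_servo_us(angle_rad, center_us=1500, us_per_rad=318):
    """Map an angle to a standard 1000-2000 us hobby-servo pulse (~180 deg range)."""
    return int(min(2000, max(1000, center_us + angle_rad * us_per_rad)))

with serial.Serial("/dev/ttyUSB0", 115200, timeout=0.05) as gimbal:
    while True:
        yaw, pitch = get_headset_orientation()
        cmd = f"P{angle_to_servo_us(yaw)},{angle_to_servo_us(pitch)}\n"
        gimbal.write(cmd.encode())
        time.sleep(1 / 60)  # roughly track the headset without flooding the link
```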

Interested in taking a closer look? The GitHub repository has the necessary code, and while most of us will never be mashing ADD TO CART on something like the Unitree H1, the reference design for a stereo camera streaming to a VR headset and mirroring head tracking with a two-motor gimbal looks like the sort of thing that would be useful for a telepresence project or two.


Read Utility Meters Via SDR To Fill Out Smart Home Stats

[Jeff Sandberg] has put a fair bit of effort into adding solar and battery storage with associated smarts to his home, but his energy usage statistics were incomplete. His solution was to read data from the utility meter with an RTL-SDR to fill in the blanks. The results are good so far, and there’s no reason similar readings couldn’t be taken from gas and water meters as well.

[Jeff] uses the open source home automation software Home Assistant which integrates nicely with his solar and battery backup system, but due to the way his house is wired, it’s only aware of about half of the energy usage in the house. For example, [Jeff]’s heavy appliances get their power directly from the power company and are not part of the solar and battery systems. This means that Home Assistant’s energy statistics are incomplete.

Fortunately, in the USA most smart meters broadcast their data in a manner that an economical software-defined radio like the RTL-SDR can receive. That provided [Jeff] with the data he needed to get a much more complete picture of his energy usage.

While getting data from utility meters is conceptually straightforward, actually implementing things in a way that integrates with his system took a bit more work. If you’re finding yourself in the same boat, be sure to look at [Jeff]’s documentation to get some ideas.
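For a flavor of what that glue code can look like, here is a sketch (not [Jeff]’s implementation) that forwards readings from the rtlamr decoder to an MQTT topic that a Home Assistant MQTT sensor can consume. The meter ID, topic name, and broker address are placeholders, and it assumes rtl_tcp is already running for rtlamr to talk to.

```python
# A sketch, not [Jeff]'s implementation: forward consumption readings decoded by
# rtlamr (which listens via rtl_tcp and an RTL-SDR) to an MQTT topic that a
# Home Assistant MQTT sensor can track. Meter ID, topic, and broker are examples.
import json
import subprocess

import paho.mqtt.client as mqtt  # 1.x constructor shown; 2.x also takes a CallbackAPIVersion

MQTT_HOST = "homeassistant.local"        # example broker address
TOPIC = "home/energy/grid_meter"         # example topic for an MQTT sensor

client = mqtt.Client()
client.connect(MQTT_HOST)
client.loop_start()

# -filterid narrows output to a single meter; SCM is the common US ERT message type.
proc = subprocess.Popen(
    ["rtlamr", "-msgtype=scm", "-filterid=12345678", "-format=json"],
    stdout=subprocess.PIPE, text=True,
)

for line in proc.stdout:
    reading = json.loads(line)
    # SCM messages carry a cumulative consumption counter; the unit depends on the meter.
    client.publish(TOPIC, reading["Message"]["Consumption"])
```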

2024 Business Card Challenge: CardTunes Bluetooth Speaker

A business card form factor can be quite limiting, but that didn’t stop [Schwimmflugel] from creating CardTunes, an ESP32-based Bluetooth audio speaker that tried something innovative to deliver the output.

What’s very interesting about this design is the speaker itself. [Schwimmflugel] aimed to create a speaker out of two coils made from flexible circuit board material, driving them with opposite polarities to create a thin speaker without the need for a permanent magnet.

The concept is sound, but in practice, performance was poor. One could identify the song being played, but only by holding the speaker up to one’s ear. The output improved considerably with the addition of a small permanent magnet behind the card, but of course this compromised the original vision.

Even though the concept of making a speaker from two flexible PCB panel coils had only mixed success, we love seeing this kind of effort and there’s a lot to learn from the results. Not to mention that it’s frankly fantastic to even have a Bluetooth speaker on a business card in the first place.

The 2024 Business Card Challenge is over, but judging by all the incredible entries we received, we’re thinking it probably won’t be too long before we come up with another size-constrained challenge.


Sealed Packs Of Pokémon Cards Give Up Their Secrets Without Opening Them

[Ahron Wayne] succeeded in something he’s been trying to accomplish for some time: figuring out what’s inside a sealed Pokémon card packet without opening it. There’s a catch, however. It took buying an X-ray CT scanner off eBay, refurbishing and calibrating it, putting a great deal of work into testing and scanning techniques, and finally combining the data with machine learning to make useful decisions. It was a lot of work, but [Ahron] succeeded by developing some genuinely novel techniques.

While using an X-ray machine to peek inside a sealed package seems conceptually straightforward, there are in fact all kinds of challenges in actually pulling it off. There is so much noise that the resulting images give a human eyeball very little to work with. Luckily, there are also some things that make the job a little easier.

For example, it’s not actually necessary to image an entire card in order to positively identify it. Teasing out individual features such as a fist, a tentacle, or a symbol is enough to eliminate possibilities. Interestingly, as a side effect the system can easily spot counterfeit cards, which show up completely differently in the scans.
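As a taste of how a partial feature can still be decisive, here is a generic sketch, not [Ahron]’s actual pipeline, that hunts for one known feature in a noisy slice using OpenCV’s normalized cross-correlation. The file names and the low match threshold are placeholders.

```python
# Generic sketch, not [Ahron]'s pipeline: look for a known card feature (say, a
# fist or an energy symbol) in a noisy CT slice with normalized cross-correlation.
# File names and the 0.4 threshold are placeholders.
import cv2

scan = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("feature_fist.png", cv2.IMREAD_GRAYSCALE)

# Denoise a little before matching; slices taken through foil and cardboard are grainy.
scan_blur = cv2.GaussianBlur(scan, (5, 5), 0)

result = cv2.matchTemplate(scan_blur, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.4:  # deliberately low threshold because the images are so noisy
    print(f"Feature candidate at {max_loc} (score {max_val:.2f})")
else:
    print("No confident match; rule this card out or scan again")
```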

When we first covered [Ahron]’s fascinating journey of bringing CT scanners back to life, he was able to scan cards but made it clear he wasn’t able to scan sealed packages. We’re delighted that he ultimately succeeded, and also documented the process. Check it out in the video below.

Continue reading “Sealed Packs Of Pokémon Cards Give Up Their Secrets Without Opening Them”

Robot Seeks And Sucks Up Cigarette Butts, With Its Feet

It would be better if humans didn’t toss cigarette butts on the ground in the first place, but change always takes longer than we think it should. In the meantime, researchers at the Italian Institute of Technology have used the problem as an opportunity to explore what seems to be a novel approach: attaching vacuum pickups to a robot’s feet, thereby removing the need for separate effectors.

VERO (Vacuum-cleaner Equipped RObot) is a robotic dog with a vacuum cleaner “backpack” and four hoses, one running down each leg. A vision system detects a cigarette butt, then has the robot plant a foot next to it so the hose can suck it up. The research paper has more details, but the video embedded below gives an excellent overview.
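To make the detect-then-step idea concrete, here is a hedged sketch that is emphatically not the VERO codebase: an off-the-shelf detector finds a butt in the camera frame, a calibration routine projects that pixel onto the ground plane, and the result is handed to the gait planner as the next footstep target. The model file and both helper functions are hypothetical.

```python
# A hedged sketch of the detect-then-step idea, not the VERO codebase. The
# ultralytics YOLO API is real, but "butts.pt" (a custom-trained model),
# pixel_to_ground(), and plant_foot_at() are hypothetical placeholders for a
# fine-tuned detector, camera calibration, and the robot's locomotion API.
import cv2
from ultralytics import YOLO

model = YOLO("butts.pt")  # hypothetical model fine-tuned on cigarette-butt images

def pixel_to_ground(u, v):
    """Hypothetical: project an image pixel to (x, y) on the ground plane."""
    raise NotImplementedError("replace with your camera calibration")

def plant_foot_at(x, y, vacuum_on=True):
    """Hypothetical: ask the gait planner to place the next footstep here."""
    raise NotImplementedError("replace with the robot's locomotion API")

cap = cv2.VideoCapture(0)  # downward-facing camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    det = model(frame, verbose=False)[0]
    if len(det.boxes):
        # Aim a foot at the center of the highest-confidence detection.
        box = det.boxes[int(det.boxes.conf.argmax())]
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        gx, gy = pixel_to_ground((x1 + x2) / 2, (y1 + y2) / 2)
        plant_foot_at(gx, gy, vacuum_on=True)
```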

While VERO needs to think carefully about route planning, using the legs as effectors is very efficient. Being a legged robot, VERO can navigate all kinds of real-world environments — including stairs — which is important because cigarette butts know no bounds.

Also, using the legs as effectors means there is no need for the robot to stop and wait while a separate device (like an arm with a vacuum pickup) picks up the trash. By simply planting a foot next to a detected cigarette butt, VERO combines locomotion with pickup.

It’s fascinating to see how the Mini Cheetah design has become mainstream to the point that these robots are available off the shelf, and it’s even cooler to see them put to use. After all, robots tackling trash is a good way to leverage machines that can focus on specific jobs, even if they aren’t super fast at it.
