Door Springs And Neopixels Demonstrate Quantum Computing Principles

They may be out of style now, and something of a choking hazard for toddlers, but there’s no denying that spring doorstops make a great sound when they’re “plucked” by a foot as you walk by. Sure, maybe not on a 2:00 AM bathroom break when the rest of the house is sleeping, but certainly when used as sensors in this interactive light show.

The idea behind [Robin Baumgarten]’s “Quantum Garden” is clear from the first video below: engaging people through touch, sound, and light. Each of the 228 springs, surrounded by a Neopixel ring, is connected to one of the 12 inputs on an MPR121 capacitive touch sensor. The touch sensors and an accelerometer in the base detect which spring is sproinging and send that information to a pair of Teensies. A PC then runs the simulations that determine how the lights will react. The display is actually capable of some pretty complex responses, including full-on games. But the most interesting modes demonstrate principles of quantum computing, specifically stimulated Raman adiabatic passage (STIRAP), which describes transfers between quantum states. While the kids in the first video were a great stress test, the second video shows the display under less stimulation and gives a better idea of how it works.
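The Teensy firmware isn’t shown here, but the electrode-scanning side is the classic MPR121 pattern: poll the touch-status register and fire an event on each new touch. Below is a minimal Arduino-style sketch along those lines using the Adafruit MPR121 library; it’s a rough stand-in for, not a copy of, [Robin]’s code, and covers just one twelve-electrode sensor.

```cpp
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap;          // one MPR121 handles 12 spring electrodes
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(115200);
  if (!cap.begin(0x5A)) {     // default I2C address for the MPR121
    Serial.println("MPR121 not found");
    while (1) { }
  }
}

void loop() {
  uint16_t touched = cap.touched();          // 12-bit mask, one bit per electrode
  for (uint8_t i = 0; i < 12; i++) {
    if ((touched & (1 << i)) && !(lastTouched & (1 << i))) {
      Serial.print("Spring ");               // a new pluck on electrode i; in the real
      Serial.print(i);                       // installation this event would be passed
      Serial.println(" plucked");            // along to the PC running the simulation
    }
  }
  lastTouched = touched;
}
```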

We like this because it uses something as simple and tactile as a door spring to demonstrate difficult quantum concepts in an engaging way. If you need more background on quantum computing, [Al Williams] has been covering the field for a while. Need the basics? Check out [Will Sweatman]’s primer.

Continue reading “Door Springs And Neopixels Demonstrate Quantum Computing Principles”

Send Smooches Over Skype With The Kiss Interface

This project of [Nathan]’s certainly has a playful straightforwardness about it. His Skype ‘Kiss’ Interface has a simple job: to create a more intuitive way to express affection within the limits of Skype. It all came about from a long-distance relationship for which the chat program was the main means of communicating. Seeking a more personal means of expressing some basic affection, [Nathan] created a capacitive touch sensor that, when touched with the lips, sends the key combination for either a kissy face emoji or the red lips emoji, depending on the duration of the kiss.

Capacitive touch sensing allows the sensor to be triggered without physically touching one’s lips to the electrodes, which [Nathan] achieved by putting a clear plastic layer over the PCB traces. His board uses an STM32 microcontroller, with software handling the USB HID side and ST’s TSC (Touch Sensing Controller) peripheral. As a result, the board has few components and a simple interface, in keeping with the goal of rejecting feature creep and focusing on a single task.
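The firmware itself isn’t reproduced here, but the duration logic is easy to picture. Here’s a rough sketch of that behavior using the stock Arduino Keyboard library instead of [Nathan]’s STM32 HID and TSC code; the touch pin, the hold-time threshold, and the emoji shortcuts are all placeholders to adapt to your own hardware and chat client.

```cpp
#include <Keyboard.h>   // Arduino USB HID keyboard (boards with native USB)

const int TOUCH_PIN = 2;                 // placeholder: digital output from a touch sensor
const unsigned long LONG_KISS_MS = 800;  // made-up threshold between the two emoji

void setup() {
  pinMode(TOUCH_PIN, INPUT);
  Keyboard.begin();
}

void loop() {
  if (digitalRead(TOUCH_PIN) == HIGH) {            // lips detected
    unsigned long start = millis();
    while (digitalRead(TOUCH_PIN) == HIGH) { }     // wait for the kiss to end
    unsigned long held = millis() - start;

    // Type whatever shortcut your chat client expands into an emoji;
    // these strings are placeholders, not Skype's actual codes.
    Keyboard.print(held < LONG_KISS_MS ? ":kiss:" : ":lips:");
    Keyboard.write('\n');
    delay(250);                                    // crude debounce
  }
}
```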

Clearly the unit works, but how well does it actually fulfill its intended purpose? We don’t know that yet, but we do know that [Nathan] seems to have everything he needs in order to find out. Either way, it’s a fun project that definitely fits the spirit of the Human-Computer Interface Challenge of The Hackaday Prize.

A Low Cost, Dead Tree Touch Screen

Remember the “paperless office”? Neither do we, because despite the hype of end-to-end digital documents, it never really happened. The workplace is still a death-trap for trees, and with good reason: paper is cheap, literally growing on trees, and it’s the quickest and easiest medium for universal communication and collaboration. Trouble is, once you’re done scribbling your notes on a legal pad or designing the Next Big Thing on a napkin, what do you do with it?

If you’re anything like us, the answer to that question is misplacing or destroying the paper before getting a chance to procrastinate transcribing it into some useful digital form. Wouldn’t paper that automatically digitizes what you draw or write on it be so much better? That’s where this low-cost touch-sensitive paper (PDF link) is headed, and it looks like it has a lot of promise. Carnegie Mellon researchers [Chris Harrison] and [Yang Zhang] have come up with cheap and easy methods of applying conductive elements to sheets of ordinary paper, and importantly, the methods could be applied right at the paper mill, taking advantage of economies of scale at the point of production. Based on silk-screened conductive paints, the digitizer uses electrical field tomography to locate touches and quantify their pressure through a connected microcontroller. The video below shows a prototype in action.
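Electric field tomography sounds intimidating, but the measurement loop at its heart is straightforward: drive a small current across one pair of perimeter electrodes, read the voltage at the rest, move on to the next pair, and let the host work out where a finger is shunting current away. The toy Arduino-style sketch below illustrates only that scan pattern; the electrode count and pin wiring are invented, the DC drive stands in for the real AC excitation, and the actual tomographic reconstruction is left to the PC.

```cpp
// Toy illustration of the electric-field-tomography scan loop, not the CMU hardware.
// Assume each of N perimeter electrodes is wired to both a digital drive pin and an
// analog sense pin; the pin numbers and electrode count here are placeholders.
const int N = 8;
const int drivePins[N] = {2, 3, 4, 5, 6, 7, 8, 9};
const int sensePins[N] = {A0, A1, A2, A3, A4, A5, A6, A7};
int baseline[N][N];                       // readings with nothing touching the sheet

void scanFrame(int out[N][N]) {
  for (int d = 0; d < N; d++) {
    int src = drivePins[d];
    int snk = drivePins[(d + 1) % N];
    pinMode(src, OUTPUT); digitalWrite(src, HIGH);   // crude DC drive across one pair;
    pinMode(snk, OUTPUT); digitalWrite(snk, LOW);    // the real system uses AC excitation
    for (int s = 0; s < N; s++) {
      out[d][s] = analogRead(sensePins[s]);          // voltage at every other electrode
    }
    pinMode(src, INPUT); pinMode(snk, INPUT);        // release the pair before moving on
  }
}

void setup() {
  Serial.begin(115200);
  scanFrame(baseline);                    // assume the sheet is untouched at power-up
}

void loop() {
  int frame[N][N];
  scanFrame(frame);
  for (int d = 0; d < N; d++) {           // stream the deviations from baseline; a touch
    for (int s = 0; s < N; s++) {         // shows up as readings sagging near its location
      Serial.print(frame[d][s] - baseline[d][s]);
      Serial.print(s == N - 1 ? '\n' : '\t');
    }
  }
  delay(100);
}
```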

Current cost is 30 cents a sheet, and if it can be made even cheaper, the potential applications range from interactive educational worksheets to IoT newspapers. And maybe if it gets really cheap, you can make a touch-sensitive paper airplane when you’re done with it.

Continue reading “A Low Cost, Dead Tree Touch Screen”

This Radio Gets Pour Reception

When was the last time you poured water onto your radio to turn it on?

Designed collaboratively by [Tore Knudsen], [Simone Okholm Hansen] and [Victor Permild], Pour Reception seeks to challenge what constitutes an interface, and how elements of play can create a new experience for a relatively everyday object.

Lacking buttons or knobs of any kind, Pour Reception appears to be an inert acrylic box with two glasses resting on top. A detachable instruction card cues the need for water, and pouring some into the glasses wakes the radio.

Continue reading “This Radio Gets Pour Reception”

Custom Sensor Head Turns 3D Printer Into Capacitive Scanner

The best thing about owning a 3D printer or CNC router may not just be what you can additively or subtractively create with it. With a little imagination you can turn your machine into a 3D scanner, and using capacitive sensors to image items turns out to be an interesting project.

[Nelson]’s scanner idea came from fiddling with some capacitive sensors at work, and with a high-resolution capacitance-to-digital sensor chip in hand, he set about building a scan head for his printer. In differential mode, the FDC2212 sensor chip uses an external LC tank circuit with two plain sensor plates set close to each other. The sensor plates form an air-dielectric variable capacitor, and the presence of an object can be detected with high sensitivity. [Nelson]’s custom sensor board and controller ride on a 3D-printed bracket and scan over the target on the printer bed. Initial results were fuzzy, but after compensating for room temperature variations and doing a little filtering on the raw data, the scans were… still pretty fuzzy. But there’s an image there, and it’s something to work with.
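Neither the compensation nor the filtering [Nelson] used is spelled out, but the general idea is a pair of low-pass filters with very different time constants: a slow one to track the baseline as the room temperature wanders, and a fast one to knock down sample-to-sample noise. Here’s a generic sketch of that kind of post-processing for one scan line; the time constants are made up and this isn’t [Nelson]’s actual pipeline.

```cpp
#include <cstdio>
#include <vector>

// Generic drift-compensation-plus-smoothing pass over one line of raw capacitance
// samples. The alpha values are placeholders to tune against real scan data.
std::vector<double> cleanScanLine(const std::vector<double>& raw) {
  std::vector<double> out(raw.size());
  double drift = raw.empty() ? 0.0 : raw.front();  // slow baseline (temperature) estimate
  double smoothed = 0.0;
  const double driftAlpha = 0.01;                  // slow: follows thermal drift
  const double smoothAlpha = 0.30;                 // fast: knocks down per-sample noise

  for (std::size_t i = 0; i < raw.size(); ++i) {
    drift += driftAlpha * (raw[i] - drift);        // track the slow wander
    double detrended = raw[i] - drift;             // remove it from the sample
    smoothed += smoothAlpha * (detrended - smoothed);
    out[i] = smoothed;
  }
  return out;
}

int main() {
  // Fake scan line: a bump in the middle riding on a slow upward drift.
  std::vector<double> raw;
  for (int i = 0; i < 100; ++i) {
    raw.push_back(1000.0 + 0.5 * i + (i > 45 && i < 55 ? 40.0 : 0.0));
  }
  for (double v : cleanScanLine(raw)) {
    std::printf("%.1f\n", v);
  }
  return 0;
}
```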

Need a slightly more approachable project to get your feet wet with capacitive sensors? Maybe you should use your phone’s touchscreen as a 2D capacitive scanner.

[via r/electronics]

Ask Hackaday: DIY Handwriting Recognition

Computer handwriting recognition is very cool by itself, and it’s something that we’d like to incorporate into a project. So we went digging for hacker solutions, and along the way came up with an interesting bit of history and some great algorithms. We feel like we’ve got a good start on that front, but we’re stuck on the hardware tablet sensor itself. So in this Ask Hackaday, we’re going to make the case for why you could be using a tablet-like device for capturing user input or doing handwriting recognition, and then we’re going to ask if you know of any good DIY tablet designs to make it work.

Continue reading “Ask Hackaday: DIY Handwriting Recognition”

Capacitive Imaging With A Raspberry Pi Touch Screen

We use touch screens all the time these days, and though we all know they support multiple touch events, it is easy to take them for granted and forget that they are a rather accomplished sensor array in their own right.

[Optismon] has long held an interest in capacitive touch screen sensors, and has recently turned his attention to the official Raspberry Pi 7-inch touchscreen display. He set out to read its raw capacitance values, and ended up with a fully functional 2D capacitive imaging device able to sense hidden nails and woodwork in his drywall.

Reading the capacitance values is not a job for the faint-hearted, though. There is an I2C bus which is handled by the Pi GPU rather than the processor, and reading it in software would require a change to the Pi’s infamous Broadcom binary blob. His solution, which he admits is not optimal, was to take another of the Pi’s I2C lines that he could talk to and connect it in parallel with the display line. As a result he can catch the readings from the screen’s sensors and, with a bit of scripting, make a 2D display on the screen. The outlines of hands and objects on his desk can clearly be seen when he places them on the screen, and when he runs the device over his wall it shows the position of the studding and nails behind the drywall.
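The snooping is the hard part; the “bit of scripting” is mostly a matter of subtracting a baseline frame and painting the differences. As a rough illustration, not anything taken from [Optismon]’s repository, here is what that display loop might look like, with the node counts and the readNode() stand-in both invented:

```cpp
#include <cstdio>
#include <cstdlib>

// Sketch of the display half only: subtract an "empty screen" baseline from each
// frame of raw capacitance counts and print a crude ASCII heatmap. The grid size
// is a placeholder, and readNode() fakes its data so the sketch runs anywhere;
// the real trick is pulling those counts off the display's I2C bus.
const int ROWS = 12, COLS = 20;

int readNode(int r, int c) {             // stand-in for the real raw-capacitance readout
  return 500 + (std::rand() % 20);
}

int main() {
  int baseline[ROWS][COLS];
  for (int r = 0; r < ROWS; ++r) {       // capture the untouched baseline once at startup
    for (int c = 0; c < COLS; ++c) {
      baseline[r][c] = readNode(r, c);
    }
  }

  const char shades[] = " .:-=+*#%@";
  for (;;) {
    for (int r = 0; r < ROWS; ++r) {
      for (int c = 0; c < COLS; ++c) {
        int delta = readNode(r, c) - baseline[r][c];     // finger, nail, or stud signal
        int level = delta < 0 ? 0 : (delta > 90 ? 9 : delta / 10);
        std::putchar(shades[level]);
      }
      std::putchar('\n');
    }
    std::puts("----");
  }
}
```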

He’s posted his code in a GitHub repository, and put up a YouTube video of his capacitive imaging in action, which you can watch below the break.

Continue reading “Capacitive Imaging With A Raspberry Pi Touch Screen”