How Low-Power Can You Go?

[lasersaber] has a passion: low-power motors. In a bid to challenge himself, and inspired by betavoltaic cells, he has 3D printed and built a small nuclear-powered motor!

This photovoltaic battery pairs fragile glass vials of tritium, extracted from keychains, with a small section of solar panel that absorbs their light to generate power. After experimenting with numerous designs, [lasersaber] went with a 3D printed pyramid that houses six coils and three magnets, encapsulated in a glass cloche and accompanied by a suitably ominous green glow.

Can you guess how much power and current are coursing through this thing? Guess again. Lower. Lower.

Under 200mV and 20nA!
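For scale, those two upper bounds multiply out to single-digit nanowatts:

```python
# Back-of-the-envelope figure for the tritium-lit motor's power budget.
voltage = 200e-3   # volts (under 200 mV)
current = 20e-9    # amps (under 20 nA)
power = voltage * current
print(f"{power * 1e9:.0f} nW")  # 4 nW -- an upper bound on the draw
```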

Continue reading “How Low-Power Can You Go?”

A Compact, Portable Pantograph Camera Slider

Ho, hum, another camera slider, right? Wrong — here’s a camera slider with a literal twist.

What sets [Schijvenaars]’ slider apart from the pack is that it’s not a slider, at least not in the usual sense. A slider is a mechanical contrivance that allows a camera to pan smoothly during a shot. Given that the object is to get a camera from point A to point B as smoothly as possible, and that sliders are often used for long exposures or time-lapse shots, the natural foundation for them is a ball-bearing linear slide, often powered by a stepper motor on a lead screw. [Schijvenaars] wanted his slider to be more compact and therefore more portable, so he designed and 3D-printed a 3-axis pantograph mechanism. The video below shows the slider panning the camera through a silky smooth 60 centimeters; a bonus of the arrangement is that it can transition from panning in one direction to the other without any jerking. Try that with a linear slider.
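As a toy model of why a pantograph covers so much ground from a small package (assumed geometry, not [Schijvenaars]' actual dimensions): stack a few scissor stages of short links and the reach multiplies, while the extension remains a smooth function of link angle through the direction reversal.

```python
import math

# Toy scissor/pantograph model: n stacked stages, each built from links of
# length L pivoted at their midpoints. Total reach as a function of link
# angle theta (radians, measured from fully folded). Dimensions are assumed.
def extension(theta, n=3, link=0.10):
    """Total reach in metres for link angle theta."""
    return n * 2 * link * math.sin(theta)

# Fully unfolded, three stages of 10 cm links span 0.6 m -- comparable to
# the 60 cm of travel shown in the video, from a much smaller folded size.
print(f"{extension(math.pi / 2):.2f} m")
```

Because the reach varies as sin(theta), the end of travel approaches smoothly rather than slamming to a stop, which is consistent with the jerk-free reversal seen in the video.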

Granted, this slider is not powered, but given that the axes are synced with timing belts, it wouldn’t be difficult to add a motor. We’ve seen a lot of sliders before, from simple wooden units to complicated overhead cranes, but this one seems like a great design with a lot of possibilities.

Continue reading “A Compact, Portable Pantograph Camera Slider”

Open Source Barbot Needs Only Two Motors

Most drinkbots are complicated, some intentionally so, others seemingly by design necessity. If you have a bunch of bottles of booze, you still need a way to get it out of the bottles in a controlled fashion, usually through motorized pouring or pumping. Sometimes all those tubes and motors and wires look really cool and sci-fi. Still, there’s nothing wrong with a really clean design.

[Lukas Šidlauskas’s] Open Source Barbot project uses just two motors to actuate nine bottles: a NEMA-17 stepper moves the tray along the length of the console, and a high-torque servo triggers the Beaumont Metrix SL spirit measures. These barman’s bottle toppers dispense 50 ml when the button is pressed, making them (along with gravity) the perfect way to elegantly manage so many bottles. Drink selection takes place in an app, connected via Bluetooth to the Arduino Mega running the show.
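The two-motor choreography boils down to "slide, press, repeat". A hypothetical sketch of that sequence (step counts and names are illustrative, not from [Lukas]'s firmware):

```python
# Hypothetical control sequence for a two-motor barbot: the stepper slides
# the tray under the chosen bottle, then one servo press on the spirit
# measure dispenses a fixed 50 ml shot. Spacing is an assumed value.
BOTTLE_SPACING_STEPS = 400   # stepper steps between adjacent bottles (assumed)
SHOT_ML = 50                 # each Beaumont Metrix SL press dispenses 50 ml

def pour(recipe, position=0):
    """recipe: list of (bottle_index, shots). Returns (steps moved, total ml)."""
    steps = 0
    total_ml = 0
    for bottle, shots in recipe:
        target = bottle * BOTTLE_SPACING_STEPS
        steps += abs(target - position)   # slide tray to the bottle
        position = target
        total_ml += shots * SHOT_ML       # one servo press per 50 ml shot
    return steps, total_ml

print(pour([(2, 1), (5, 2)]))  # a single from bottle 2, a double from bottle 5
```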

The Barbot is an Open Source project, with project files available from [Lukas]’s GitHub repository and discussions taking place in a Slack group.

If it’s barbots you’re after, check out this Synergizer 4-drink barbot and the web-connected barbot we published a while back.

Continue reading “Open Source Barbot Needs Only Two Motors”

AI Watches You Sleep; Knows When You Dream

If you’ve never been a patient at a sleep laboratory, monitoring a person as they sleep is an involved process of wires, sensors, and discomfort. Seeking a better method, MIT researchers — led by [Dina Katabi] and in collaboration with Massachusetts General Hospital — have developed a device that can non-invasively identify the stages of sleep in a patient.

Approximately the size of a laptop and mounted on a wall near the patient, the device measures the minuscule changes in reflected low-power RF signals. The wireless signals are analyzed by a deep neural network that predicts the patient’s sleep stage — light, deep, or REM — negating the task of manually combing through the data. Despite the sensitivity of the device, it is able to filter out irrelevant motions and interference, focusing on the breathing and pulse of the patient.

What’s novel here isn’t so much the hardware as it is the processing methodology. The researchers use both convolutional and recurrent neural networks along with what they call an adversarial training regime:

Our training regime involves 3 players: the feature encoder (CNN-RNN), the sleep stage predictor, and the source discriminator. The encoder plays a cooperative game with the predictor to predict sleep stages, and a minimax game against the source discriminator. Our source discriminator deviates from the standard domain-adversarial discriminator in that it takes as input also the predicted distribution of sleep stages in addition to the encoded features. This dependence facilitates accounting for inherent correlations between stages and individuals, which cannot be removed without degrading the performance of the predictive task.
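The structure of that three-player objective can be sketched in a few lines of NumPy. This is a schematic stand-in, not the authors' model: random vectors play the role of the CNN-RNN features, and the key details are that the discriminator sees the predicted stage distribution alongside the features, and that the encoder's loss subtracts the discriminator's (the minimax game):

```python
import numpy as np

# Schematic of the three-player adversarial objective (toy stand-ins, not
# the researchers' network): an encoder yields features z, the predictor
# outputs a distribution p over sleep stages, and the source discriminator
# guesses the subject from BOTH z and p concatenated.
rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

z = rng.normal(size=8)                      # encoded features (CNN-RNN stand-in)
p = softmax(rng.normal(size=4))             # predicted stage distribution
disc_in = np.concatenate([z, p])            # discriminator input: features + p
W_d = rng.normal(size=(3, disc_in.size))    # toy discriminator weights
q = softmax(W_d @ disc_in)                  # P(source subject | z, p)

true_stage, true_subject = 2, 1
pred_loss = -np.log(p[true_stage])          # predictor cross-entropy
disc_loss = -np.log(q[true_subject])        # discriminator cross-entropy
lam = 0.5                                   # adversarial weight (assumed)
encoder_loss = pred_loss - lam * disc_loss  # encoder fights the discriminator
print(round(float(encoder_loss), 3))
```

Feeding p into the discriminator is the twist: it lets the training account for correlations between sleep stages and individuals instead of trying to scrub subject identity out of the features entirely.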

Anyone out there want to give this one a try at home? We’d love to see a HackRF and GNU Radio used to record RF data. The researchers compare the RF to WiFi, so repurposing a 2.4 GHz radio to send out repeating, uniform transmissions is a good place to start. Dump it into TensorFlow and report back.

Continue reading “AI Watches You Sleep; Knows When You Dream”

Save Your Thumbs With This Netflix Password Sender

Chances are anyone who has an entry-level to mid-range smart TV knows that setting them up with your streaming account credentials is a royal pain. Akin to the days of texting on a flip phone, using the number pad or arrow keys to compose your user name and password seems to take forever. So why not avoid the issue with this automated Netflix logger-inner?

As if the initial setup wasn’t bad enough, when [krucho5]’s LG smart TV started asking for his Netflix credentials every few days, he knew something needed to be done. An Arduino to send “keystrokes” was the obvious solution, but when initial attempts to spoof the HID on the set proved fruitless, [krucho5] turned to the IR remote interface. He used an IR receiver module to capture the codes sent while entering his user name and password, and now an IR LED plays them back anytime the TV asks. The video below shows how much easier it is now, and the method should work just fine for any other online service accounts.
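The record-and-replay trick is simple at heart: a remote keypress is just a sequence of mark/space durations, so logging each credential keystroke once lets an IR LED repeat it on demand. A pure-simulation sketch (the timings below are invented placeholders, not actual LG codes):

```python
# Sketch of IR record-and-replay (simulation only -- real captures come
# from an IR receiver module wired to an Arduino pin). Each keypress is a
# list of mark/space durations in microseconds; timings are placeholders.
captured = {
    "user_key_1": [9000, 4500, 560, 560, 560, 1690],   # header + two bits
    "enter":      [9000, 4500, 560, 1690, 560, 560],
}

def replay(sequence):
    """Yield (state, microseconds) pairs to drive the IR LED with."""
    for i, duration in enumerate(sequence):
        yield ("mark" if i % 2 == 0 else "space", duration)

frames = list(replay(captured["enter"]))
print(frames[0])   # ('mark', 9000)
```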

We like [krucho5]’s build, but the fit and finish are a little rough. Perhaps slipping it into a pair of Netflix-enabled socks would be a nice touch?

Continue reading “Save Your Thumbs With This Netflix Password Sender”

Color Sensor From An RGB LED And A Photocell

When you need to quantify the color of an object, you’ve got quite a few options. You can throw a Raspberry Pi camera and OpenCV at the problem and approach it through software, or you can buy an off-the-shelf RGB sensor and wire it up to an Arduino. Or you can go back to basics and build this reflective RGB sensor from an LED and a photocell.

The principle behind [TechMartian]’s approach is simplicity itself: shine different colored lights on an object and measure how much light it reflects. If you know the red, green, and blue components of the light that correspond to maximum reflectance, then you know the color of the object. Their sensor uses a four-lead RGB LED, but we suppose a Neopixel could be used as well. The photosensor is a simple cadmium sulfide cell, which measures the intensity of light bouncing back from an object as an Arduino drives the LED through all possible colors with PWM signals. The sensor needs to be white balanced before use but seems to give sensible results in the video below. One imagines that a microcontroller-free design would be possible too, with 555s sweeping the PWM signals and op-amps taking care of detection.
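The measure-and-white-balance step is easy to sketch: take a reference reading off a white card for each channel, then normalise every later sample against it. The ADC counts below are illustrative, not from [TechMartian]'s build:

```python
# Minimal sketch of white-balanced reflectance sensing (invented numbers;
# real readings come from a CdS cell on an Arduino ADC). Shine R, G, B in
# turn, record reflected intensity, and normalise by the white reference.
white_ref = {"r": 820, "g": 900, "b": 760}   # ADC counts off a white card

def classify(sample):
    """Return normalised RGB reflectance, 0..1 per channel."""
    return {c: sample[c] / white_ref[c] for c in ("r", "g", "b")}

reading = {"r": 810, "g": 240, "b": 190}     # a reddish object, say
balanced = classify(reading)
dominant = max(balanced, key=balanced.get)
print(dominant)   # 'r'
```

Without the white-balance step, the CdS cell's uneven spectral response (and the LED's uneven brightness per channel) would skew every reading the same way.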

And what’s the natural endpoint for a good RGB sensor? A candy sorter, of course, of which we have many examples, from the sleek and polished to the slightly more hackish.

Continue reading “Color Sensor From An RGB LED And A Photocell”

A Flame Diode Pilot Light Sensor For A Burning Man Installation

A naked flame is a complex soup of ionised gases that possesses an unexpected property. As you might expect with that much ionisation there is some level of electrical conductivity, but the unusual part is that a flame can be made to conduct in only one direction. In other words, it can become a diode of sorts, in a manner reminiscent of a vacuum tube diode.

[Paul Stoffregen] has made use of this phenomenon in a flame detector that he’s built to be installed on a Burning Man flame-based art installation. It forms part of a response to a problem with traditional pilot lights: when the wind blows a pilot light out, a cloud of unignited gas can accumulate. The sensor allows the pilot light to be automatically re-ignited if the flame is no longer present.

The circuit is a surprisingly simple one, with a PNP transistor turned on by the flame diode in its base circuit. This allows the intensity of the flame to be measured as well as whether or not it is present, all at the cost of a microscopic current draw. A capacitor is charged by the transistor, and the charge time is measured by a Teensy that uses it to estimate flame intensity and trigger the pilot light if necessary. Interestingly, the design comes from a patent that expired in 2013; it’s always worth including that particular line of research in your investigations.
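A back-of-the-envelope model shows why charge time makes a good intensity proxy. Treating the flame-driven transistor as a rough constant-current source (and using assumed component values, not [Paul]'s actual circuit), the time to charge the capacitor to the Teensy's logic threshold shrinks as flame current grows:

```python
# Toy model of the charge-time measurement: constant-current approximation
# with assumed component values, not the actual circuit's.
C = 1e-6        # farads (assumed)
V_TH = 1.65     # volts, a typical threshold for 3.3 V logic

def charge_time(i_flame):
    """Seconds for a constant current i_flame to charge C up to V_TH."""
    return C * V_TH / i_flame

weak, strong = charge_time(100e-9), charge_time(1e-6)
print(f"{weak:.2f} s vs {strong:.2f} s")  # a stronger flame charges faster
```

No flame means no charge current at all, so a timeout on the measurement doubles as the "pilot light is out" signal that triggers re-ignition.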

All the construction details are in the page linked above, and you can see the system under test in the video below the break.

Continue reading “A Flame Diode Pilot Light Sensor For A Burning Man Installation”