Impressive Drawing Machine For One Made So Simply

Not all of us have CNC machines, laser cutters and 3D printers, and I’ll bet most of us didn’t start out that well equipped. The low-cost drawing machine that [jegatheesan] made for his daughter reminds us that you can prototype and then build a functioning mechanical Da Vinci with very basic materials and mostly hand tools. He also wrote his own drawing software, with an interface that shares the machine’s simplicity.

There really are a lot of things to like about [jegatheesan]’s project. He first worked out the math himself, doing something the likes of which we’ve all enjoyed: digging out the old school trigonometry and algebra books for a refresher. Then he got started on his prototype, made using a cardboard tube for the main support and straws and safety pins for the drawing arms. He already had a motor shield for his Arduino, but it supported only two servos, so he made his own three-servo shield. In the end, the prototype told him he had to redo some calculations before moving on to the final machine.
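The write-up doesn’t spell out his final equations, but the trig he’s brushing up on is almost certainly the classic two-link inverse kinematics problem: given a target point on the paper, find the two joint angles. Here’s a minimal Arduino-flavored sketch of that math, with the link lengths, pins, and target point all made up for illustration rather than taken from [jegatheesan]’s build:

```cpp
// Minimal sketch of the standard two-link inverse kinematics a two-servo
// drawing arm needs. Link lengths, pins, and the target point are
// placeholders, not [jegatheesan]'s actual numbers.
#include <Servo.h>
#include <math.h>

const float L1 = 80.0;   // shoulder-to-elbow length, mm (assumed)
const float L2 = 80.0;   // elbow-to-pen length, mm (assumed)

Servo shoulder, elbow;

// Convert a target (x, y) in mm into the two joint angles, in degrees.
// Returns false if the point is out of reach.
bool solveIK(float x, float y, float &shoulderDeg, float &elbowDeg) {
  float d2 = x * x + y * y;
  float c = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);  // cos(elbow angle)
  if (c < -1.0 || c > 1.0) return false;                 // unreachable point
  float elbowRad = acos(c);                              // elbow-up solution
  float shoulderRad = atan2(y, x) - atan2(L2 * sin(elbowRad), L1 + L2 * cos(elbowRad));
  shoulderDeg = shoulderRad * 180.0 / M_PI;
  elbowDeg    = elbowRad * 180.0 / M_PI;
  return true;
}

void setup() {
  shoulder.attach(9);    // pin choices are arbitrary here
  elbow.attach(10);
}

void loop() {
  float a, b;
  if (solveIK(60.0, 100.0, a, b)) {    // move the pen to an example point
    shoulder.write(constrain(a, 0, 180));
    elbow.write(constrain(b, 0, 180));
  }
  delay(20);
}
```

A third servo only has to lift the pen clear of the paper, which is where the cam trick described below comes in.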

One thing we can say about the final machine is that hot glue must truly be the maker’s connect-all — you won’t find many screws here. Even the servos are held in place with copious quantities of glue. The mechanism for lifting the pen is also quite clever. The whole thing is mounted on two vertical guide rods so that it can slide up and down easily. To make it actually move, he glued a toy car wheel off-center on a servo arm. When the servo turns, the off-center wheel acts like a cam, pushing down on the wooden base to either lift the machine or lower it, depending on where the wheel is in its rotation.
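To put rough numbers on that cam action: if the wheel has radius R and is glued with its center a distance e off the servo shaft, then with the wheel bearing on the flat base the shaft-to-base distance is about h(θ) = R + e·cos(θ). Half a turn of the servo therefore raises or lowers the whole pen assembly by 2e. The build doesn’t give dimensions, so R and e here are just stand-ins for the wheel’s size and how far off-center it was glued.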

See his hackaday.io page for the full step-by-step development process. But first check out the videos below to see how impressive such a simply made machine is in action.

Continue reading “Impressive Drawing Machine For One Made So Simply”

Tracing A Scene An Old-Fashioned Way

Taking a picture is as simple as tapping a screen. Drawing a memorable scene, even when it’s directly in front of you, is a different skill entirely. So trace it! Well, that’s kind of hard to do without appropriate preparation.

[bobsteaman]’s method is to first whip up a pantograph — it tested well with a felt marker on the end. Next, he built a camera obscura into a small wood box with a matte plexiglass top, which didn’t work quite so well. A magnifying glass above the camera’s pinhole aperture helped, but arduous testing was needed to ensure it sat at the perfect position for a clear image. The matte plexiglass was also thrown out and, after some experimentation, replaced with a sheet of semi-transparent baking paper sandwiched between two pieces of clear plexiglass. The result is hard to argue with.
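That fussy positioning is presumably just the thin-lens equation at work: 1/f = 1/do + 1/di, where f is the magnifier’s focal length, do the distance to the scene, and di the distance at which a sharp image forms. For a distant scene, di collapses to roughly one focal length behind the lens, and anything short or long of that smears the image on the screen, which would explain the arduous trial and error.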

Continue reading “Tracing A Scene An Old-Fashioned Way”

Hack Together A Whack-A-Mole In A Box!

Here’s a project that you can throw together in an afternoon, provided you have the parts on hand, and that is certain to entertain. Hackaday.io user [SunFounder] walks us through the process of transforming a humble cardboard box into a whack-a-mole game that might be just the ticket to pound out some stress or captivate any children in the vicinity.

A multi-control board and nine arcade buttons are the critical pieces of hardware here, with wires and a USB cable rounding out the rest of the electronics. Separate the button core from the upper shell, mount the shell in the box, and connect the button core’s LED cathode to the button’s ON terminal. Repeat eight times. Solder the buttons in parallel and add some more wire to the buttons’ ON terminals to extend their reach. Repeat eight more times.

Place the finished LED-and-core assemblies back into the buttons and connect their ON terminals to their respective buttons on the multi-control board. Now for the hard step: use a mini-USB to USB cable to connect the controller to the computer that will run the game’s code in the Arduino IDE. Modify the key-mappings and away you go! Check out the build video after the break.
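Part of why the wiring stays so simple is that the control board just presents the buttons to the computer as key presses. We don’t know what firmware [SunFounder]’s board actually runs, but on any Arduino-compatible board with native USB (a Leonardo-class ATmega32U4, for instance) the idea reduces to something like this hypothetical sketch, with pins and keys chosen arbitrarily:

```cpp
// Hypothetical sketch: present nine arcade buttons as the keys '1'..'9'.
// Pin numbers and the key mapping are placeholders, not SunFounder's firmware.
#include <Keyboard.h>

const uint8_t buttonPins[9] = {2, 3, 4, 5, 6, 7, 8, 9, 10};
const char keys[9] = {'1', '2', '3', '4', '5', '6', '7', '8', '9'};
bool pressed[9] = {false};

void setup() {
  for (uint8_t i = 0; i < 9; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // buttons pull the pin low when hit
  }
  Keyboard.begin();
}

void loop() {
  for (uint8_t i = 0; i < 9; i++) {
    bool hit = (digitalRead(buttonPins[i]) == LOW);
    if (hit && !pressed[i]) Keyboard.press(keys[i]);     // mole whacked
    if (!hit && pressed[i]) Keyboard.release(keys[i]);   // button released
    pressed[i] = hit;
  }
  delay(10);  // crude debounce
}
```

Changing the key-mapping is then just a matter of editing the keys[] array, or its equivalent, to match whatever the game expects.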

Continue reading “Hack Together A Whack-A-Mole In A Box!”

How Low-Power Can You Go?

[lasersaber] has a passion: low-power motors. In a bid to challenge himself, and inspired by betavoltaic cells, he has 3D-printed and built a small nuclear-powered motor!

This photovoltaic battery uses fragile glass vials of tritium, extracted from keychains, and a small section of a solar panel to absorb their glow and generate power. After experimenting with numerous designs, [lasersaber] went with a 3D-printed pyramid that houses six coils and three magnets, encapsulated in a glass cloche and accompanied by a suitably ominous green glow.

Can you guess how much voltage and current are coursing through this thing? Guess again. Lower. Lower.

Under 200 mV and 20 nA!
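For a sense of scale, even if those peaks happened at the same instant, the power works out to P = V × I ≈ 0.2 V × 20 nA = 4 nW, or about four billionths of a watt.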

Continue reading “How Low-Power Can You Go?”

A Compact, Portable Pantograph Camera Slider

Ho, hum, another camera slider, right? Wrong — here’s a camera slider with a literal twist.

What sets [Schijvenaars]’ slider apart from the pack is that it’s not a slider, at least not in the usual sense. A slider is a mechanical contrivance that allows a camera to pan smoothly during a shot. Given that the object is to get a camera from point A to point B as smoothly as possible, and that sliders are often used for long exposures or time-lapse shots, the natural foundation for them is a ball-bearing linear slide, often powered by a stepper motor on a lead screw. [Schijvenaars] wanted his slider to be more compact and therefore more portable, so he designed and 3D-printed a 3-axis pantograph mechanism. The video below shows the slider panning the camera through a silky smooth 60 centimeters; a bonus of the arrangement is that it can transition from panning in one direction to the other without any jerking. Try that with a linear slider.

Granted, this slider is not powered, but given that the axes are synced with timing belts, it wouldn’t be difficult to add a motor. We’ve seen a lot of sliders before, from simple wooden units to complicated overhead cranes, but this one seems like a great design with a lot of possibilities.
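If someone did want to motorize it, the simplest route would probably be a stepper on one of those belts, driven through a step/direction driver and ramped slowly for time-lapse work. This is purely our sketch, not part of [Schijvenaars]’ design, and every number is a placeholder:

```cpp
// Hypothetical add-on: drive the slider's timing belt with a stepper through a
// step/dir driver (A4988, DRV8825, etc.). All pins and counts are made up.
const int STEP_PIN = 3;
const int DIR_PIN = 4;
const long STEPS_PER_PASS = 16000;       // however many steps cover the 60 cm sweep
const unsigned int STEP_DELAY_US = 800;  // larger = slower, smoother time-lapse

bool forward = true;

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
}

void loop() {
  digitalWrite(DIR_PIN, forward ? HIGH : LOW);
  // One full pass of the slider, one step pulse at a time.
  for (long i = 0; i < STEPS_PER_PASS; i++) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(STEP_DELAY_US);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(STEP_DELAY_US);
  }
  forward = !forward;  // reverse for the next pass, mimicking the back-and-forth in the video
  delay(500);
}
```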

Continue reading “A Compact, Portable Pantograph Camera Slider”

Open Source Barbot Needs Only Two Motors

Most drinkbots are complicated—some intentionally so, others seemingly by design necessity. If you have a bunch of bottles of booze, you still need a way to get it out of the bottles in a controlled fashion, usually through motorized pouring or pumping. Sometimes all those tubes and motors and wires look really cool and sci-fi. Still, there’s nothing wrong with a really clean design.

[Lukas Šidlauskas’s] Open Source Barbot project uses only two motors to actuate nine bottles: a NEMA-17 stepper moves the tray along the length of the console, and a high-torque servo triggers the Beaumont Metrix SL spirit measures. These barman’s bottle toppers dispense 50 ml whenever their button is pressed, making them (along with gravity) the perfect way to elegantly manage so many bottles. Drink selection takes place in an app, connected via Bluetooth to the Arduino Mega running the show.
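We haven’t dug through [Lukas]’s firmware, but the two-motor dance boils down to: step the tray to the selected bottle, then swing the servo to press the measure and let gravity do the rest. A rough Arduino-style sketch of that sequence, with every pin and constant invented for illustration:

```cpp
// Hypothetical sketch of the two-motor pour sequence: a step/dir stepper
// driver moves the tray, a servo presses the spirit measure. Every pin and
// constant here is a placeholder, not taken from [Lukas]'s firmware.
#include <Servo.h>

const int STEP_PIN = 2, DIR_PIN = 3, SERVO_PIN = 9;
const long STEPS_PER_BOTTLE = 2000;           // tray travel between neighbouring bottles
const int SERVO_REST = 10, SERVO_PRESS = 80;  // servo angles, degrees

Servo dispenser;
int currentBottle = 0;

void moveToBottle(int target) {
  long steps = (long)(target - currentBottle) * STEPS_PER_BOTTLE;
  long n = abs(steps);
  digitalWrite(DIR_PIN, steps >= 0 ? HIGH : LOW);
  for (long i = 0; i < n; i++) {
    digitalWrite(STEP_PIN, HIGH); delayMicroseconds(500);
    digitalWrite(STEP_PIN, LOW);  delayMicroseconds(500);
  }
  currentBottle = target;
}

void pour() {                  // one press = one 50 ml measure
  dispenser.write(SERVO_PRESS);
  delay(1500);                 // hold long enough for the measure to empty
  dispenser.write(SERVO_REST);
  delay(3000);                 // let the measure refill before the next pour
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  dispenser.attach(SERVO_PIN);
  dispenser.write(SERVO_REST);
}

void loop() {
  // In the real build the bottle and pour count arrive over Bluetooth from
  // the app; here we just pour a single measure from bottle 4 and stop.
  moveToBottle(4);
  pour();
  while (true) {}
}
```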

The Barbot is an open source project, with files available from [Lukas]’s GitHub repository and discussions taking place in a Slack group.

If it’s barbots you’re after, check out this Synergizer 4-drink barbot and the web-connected barbot we published a while back.

Continue reading “Open Source Barbot Needs Only Two Motors”

AI Watches You Sleep; Knows When You Dream

If you’ve never been a patient at a sleep laboratory, you may not realize that monitoring a person as they sleep is an involved process of wires, sensors, and discomfort. Seeking a better method, MIT researchers — led by [Dina Katabi] and in collaboration with Massachusetts General Hospital — have developed a device that can non-invasively identify the stages of sleep in a patient.

Approximately the size of a laptop and mounted on a wall near the patient, the device measures the minuscule changes in reflected low-power RF signals. A deep neural network analyzes the wireless signals and predicts the patient’s sleep stage — light, deep, or REM — doing away with the task of manually combing through the data. Despite its sensitivity, the device is able to filter out irrelevant motions and interference, focusing on the patient’s breathing and pulse.

What’s novel here isn’t so much the hardware as it is the processing methodology. The researchers use both convolutional and recurrent neural networks along with what they call an adversarial training regime:

Our training regime involves 3 players: the feature encoder (CNN-RNN), the sleep stage predictor, and the source discriminator. The encoder plays a cooperative game with the predictor to predict sleep stages, and a minimax game against the source discriminator. Our source discriminator deviates from the standard domain-adversarial discriminator in that it takes as input also the predicted distribution of sleep stages in addition to the encoded features. This dependence facilitates accounting for inherent correlations between stages and individuals, which cannot be removed without degrading the performance of the predictive task.
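Stripped to its bones, and with λ as a trade-off weight of our own choosing rather than the paper’s notation, that three-player setup has the familiar domain-adversarial shape:

\[
\min_{E,\,P}\;\max_{D}\;\;\mathcal{L}_{\text{stage}}\big(P(E(x)),\,y\big)\;-\;\lambda\,\mathcal{L}_{\text{source}}\big(D(E(x),\,P(E(x))),\,s\big)
\]

Here E is the CNN-RNN encoder, P the sleep-stage predictor, D the source discriminator, and s identifies which subject or environment the recording came from. The encoder and predictor cooperate to drive the stage loss down while pushing the source loss up, and the twist the authors describe is that D also gets the predicted stage distribution as an extra input.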

Anyone out there want to give this one a try at home? We’d love to see a HackRF and GNU Radio used to record the RF data. The researchers compare the RF signals to WiFi, so repurposing a 2.4 GHz radio to send out repeating, uniform transmissions is a good place to start. Dump it into TensorFlow and report back.

Continue reading “AI Watches You Sleep; Knows When You Dream”