Look Upon Eyepot, And Weep For Mercy

Hope you weren’t looking forward to a night of sleep untroubled by nightmares. Doing his part to make sure Lovecraftian mechanized horrors have a lease on your subconscious, [Paul-Louis Ageneau] has recently unleashed the horror that is Eyepot upon an unsuspecting world. This cycloptic, four-legged robotic teapot takes inspiration from an enemy in the game Alice: Madness Returns, and seems to exist for no reason other than to creep people out.

Even if you aren’t physically manifesting nightmares, there’s plenty to learn from this project. [Paul-Louis Ageneau] has done a fantastic job of documenting the build, from the OpenSCAD-designed 3D printed components to the Raspberry Pi Zero and Arduino Pro Mini combo that controls the eight servos in the legs. If you want to play along at home, all the information and code are available, though feel free to skip the whole teapot-with-an-eyeball thing.

A second post explains how the code is written for both the Arduino and the Pi, making for some very illuminating reading. A Python script on the Pi breaks down the kinematics and passes the appropriate servo angles to the Arduino over a serial link. Combine that with a web interface for control and a stream from the teapot’s Raspberry Pi Camera module, and you’ve got the makings of the world’s creepiest telepresence robot. We’d love to see this one stomping up and down a boardroom table.
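
The linked posts have the real code; purely to illustrate that division of labor, the Pi side of such a setup might look like the minimal sketch below. The serial port, message format, link lengths, and servo offsets are all assumptions made for the example, not details taken from [Paul-Louis Ageneau]’s implementation.

```python
import math
import serial  # pyserial

# Hypothetical link to the Arduino Pro Mini; the port name and baud rate are guesses.
link = serial.Serial("/dev/serial0", 115200, timeout=1)

def leg_ik(x, z, thigh=40.0, shin=40.0):
    """Toy two-link inverse kinematics: foot position (x, z) in mm to
    (hip, knee) servo angles in degrees. Link lengths are placeholders."""
    d = max(min(math.hypot(x, z), thigh + shin - 1e-3), abs(thigh - shin) + 1e-3)
    knee = math.acos((thigh**2 + shin**2 - d**2) / (2 * thigh * shin))
    hip = math.atan2(z, x) + math.asin(min(1.0, shin * math.sin(knee) / d))
    return 90.0 + math.degrees(hip), 180.0 - math.degrees(knee)  # 90 = assumed servo center

def send_pose(angles):
    """Send all eight servo angles as one comma-separated ASCII line."""
    link.write((",".join(f"{a:.0f}" for a in angles) + "\n").encode())

# Put every leg in the same illustrative stance.
hip, knee = leg_ik(30.0, -50.0)
send_pose([hip, knee] * 4)
```

The matching Arduino sketch would then simply parse that line and write each angle out to its servo, keeping the timing-critical PWM generation off the Pi.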

Seems we are on a roll recently with creepy robot pals. Seeing a collaboration between Eyepot and JARVIS might be too much for us to handle. Though we have a pretty good idea how we’d want to control them.

Taking Halloween To The Next Level With JARVIS

As an avid “Haunt Hacker”, [Steve Koci] knows a thing or two about bringing high-tech to Halloween. Wanting to build a mobile robot that could accompany him to conventions as a demonstration of the sort of animatronic mechanisms and controls he uses, he came up with the idea of JARVIS. The original plan was to make a more traditional robot, but with the addition of an animated skull and some Steampunk-style embellishments, JARVIS is definitely the kind of thing you don’t want to run into on an October night.

Construction of JARVIS started in 2016, after [Steve] saw the Agent 390 tracked robot chassis from ServoCity. With the addition of extra wheels and a custom track, he converted the Agent 390 into a triangular track arrangement, a look he says he’s had his eye on ever since “Johnny 5” sported it in Short Circuit.

There’s a dizzying array of electronics required to make JARVIS move and talk, not least of which is the “Banshee” prop controller. This device is designed to simplify the construction of animatronic heads, providing not only organic-looking randomized movement but also automatic jaw synchronization. Using a wireless audio connection, [Steve] is able to talk through a speaker mounted on the chest of the robot, while the skull automatically matches its mouth to his speech in real time. Combined with the GoPro in a two-axis gimbal, this allows JARVIS to function as a fairly robust telepresence platform. Much to the delight/horror of those it’s used on.
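
The Banshee handles that lip sync in dedicated hardware, but the basic trick behind amplitude-driven jaw movement is simple enough to sketch. The snippet below is a conceptual illustration only; the servo angles, gain, and frame size are invented numbers, not anything pulled from the Banshee or from [Steve]’s build.

```python
import math

SERVO_CLOSED, SERVO_OPEN = 20, 70   # jaw servo angles in degrees (illustrative)

def jaw_angle(samples, gain=4.0):
    """Map one short audio frame to a jaw angle: louder speech, wider mouth.
    'samples' is an iterable of floats in the range -1.0 .. 1.0."""
    rms = math.sqrt(sum(s * s for s in samples) / max(len(samples), 1))
    level = min(1.0, gain * rms)
    return SERVO_CLOSED + level * (SERVO_OPEN - SERVO_CLOSED)

# A silent frame keeps the mouth closed; a loud one swings it wide open.
print(jaw_angle([0.0] * 256))                              # 20 degrees
print(jaw_angle([math.sin(i / 3) for i in range(256)]))    # about 70 degrees
```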

Getting JARVIS to move requires not only the two beefy motors and a dedicated controller supplied by the Agent 390 platform, but no less than thirteen servos for the head, arms, and grippers. There’s even a linear actuator used to tilt the skull up and down, presumably for terrifying people of various heights and ages. JARVIS even has a pair of Adafruit’s electronic eyes mounted in the skull, as if you thought you would be spared the horror of seeing glowing eyes following you in the dark.

To control all this hardware, [Steve] uses two RC transmitters in conjunction with a smartphone displaying the video feed coming from the GoPro. It takes some serious finger-gymnastics to get JARVIS doing its thing, which [Steve] says he’s still trying to master.

As many projects that have graced these pages can attest to, hackers seem to delight in coming up with new and exciting ways to terrify the young and old alike. Sometimes they can’t even wait until Halloween.

Nerds Unite: Prosthetics Inspired By Comics And Beyond!

Open Bionics is a company creating prosthetics inspired by heroines, heroes, and the fictional worlds they live in. The designs emblazoned on their first set of bionic hands include ones drawn from Queen Elsa of Disney’s Frozen and Marvel’s Iron Man. The best thing about what they are doing is that they offer you, dear reader, a chance to lend your own superpowers of design and engineering.

Open Bionics offers up 3D print files for several hand designs, hardware schematics and design files for their controller boards, firmware, and software to control the robotic hands. Other than their website, you can also find all of the files and more on their GitHub account. If you’d like to devote a good amount of time and become a developer, they have a form to contact them through. To help with sourcing parts for your own build, they sell cables for tendons, muscle sensors, and fingertip grips in their online store.

We first came to learn about this company through a tip from [Dj Biohazard], who pointed to a post about their partnership with 11-year-old Tilly, who is pictured on the left. Her bionic hand is an Open Bionics prototype whose design is based on the video game Deus Ex. The best way products like these get improved is through the open source community and people like her.

Specific improvements Open Bionics lists on their website are:

  • The customised bionic arms are manufactured in under 24 hours and the revolutionary socket adjusts as the child grows.
  • The bionic arms are light and small enough for those as young as eight.
  • The bionic arms use myoelectric skin sensors to detect the user’s muscle movements, which can be used to control the hand and open and close the fingers (a rough sketch of that idea follows below).
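
That last point is where the interesting control problem lives. As a purely illustrative sketch of threshold-based myoelectric control (the filter, threshold, and toggle-on-flex behavior are assumptions for the demo, not how Open Bionics’ firmware actually works), the logic might look like this:

```python
import random
import time

FLEX_THRESHOLD = 0.6   # purely illustrative; a real prosthesis is calibrated per user
SAMPLE_PERIOD = 0.01   # 100 Hz polling

def read_emg():
    """Stand-in for the myoelectric skin sensor: returns a normalized 0-1
    muscle-activity level. Random noise here, so the sketch runs anywhere."""
    return random.random()

def run(cycles=500):
    """Toggle the grip each time the smoothed muscle signal crosses the threshold."""
    level, closed, was_above = 0.0, False, False
    for _ in range(cycles):
        level = 0.9 * level + 0.1 * read_emg()   # simple low-pass filter
        above = level > FLEX_THRESHOLD
        if above and not was_above:              # rising edge = a deliberate flex
            closed = not closed
            print("grip closed" if closed else "grip open")
        was_above = above
        time.sleep(SAMPLE_PERIOD)

if __name__ == "__main__":
    run()
```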

Read more about Tilly’s story and her partnership with Open Bionics on Womanthology. Tilly seems to have a dream of her own: to “make prosthetics a high fashion piece – something that amputees can be proud to wear.”

We at Hackaday have written about several open source prosthetic developments, such as the five-day S.T.E.A.M. Fabrikarium program held at Maker’s Asylum in Mumbai and the work of [Nicholas Huchet]. What superhuman-inspired designs would you create?

This 3D-Printed Robotic Vacuum Sucks

After you’ve taken a moment to ponder the turn of phrase used in the title, take a look at this scratch-built robotic vacuum created by [theking3737]. The entire body of the vacuum was 3D printed, and all of the internal electronics are off-the-shelf modular components. We can’t say how well it stacks up against the commercial equivalents from iRobot and the like, but it doesn’t look like it would be too hard to build one yourself to find out.

The body of this rather concerned-looking robot was printed on a DMS DP5 printer, which is a neat trick given that the printer only has a 200 mm x 200 mm build area. Once all the pieces were printed, a 3D pen was used to “weld” the sections together. The final result looks a bit rough, but should give a bond that’s just as strong as the printed parts themselves.

The robot has four sets of ultrasonic range finders to detect walls and obstacles, though probably not in the positions you would expect. The right side of the robot has two sets of sensors, while the left side only gets one. We aren’t sure of the reasoning behind the asymmetrical layout, but presumably the machine prefers making right turns.

Control is provided by an Arduino Mega and the ever-reliable HC-05 Bluetooth module. A companion Android application allows configuring the robot without having to plug into the Arduino every time you want to tweak a setting.
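
Neither the Arduino sketch’s command set nor the Android app is documented in detail here, so purely as an illustration of what that kind of Bluetooth configuration link tends to look like from the host side, here is a hypothetical Python session over the HC-05’s serial port profile. The port name and every command string below are invented for the example.

```python
import serial  # pyserial

# Once paired, the HC-05 shows up as an ordinary serial port (e.g. /dev/rfcomm0 on Linux).
bt = serial.Serial("/dev/rfcomm0", 9600, timeout=2)

def send_cmd(cmd):
    """Send one newline-terminated command and return the robot's reply."""
    bt.write((cmd + "\n").encode("ascii"))
    return bt.readline().decode("ascii", errors="replace").strip()

print(send_cmd("SET SPEED 70"))      # hypothetical: drive motor duty cycle, percent
print(send_cmd("SET MIN_DIST 25"))   # hypothetical: obstacle threshold in cm
print(send_cmd("GET SONAR"))         # hypothetical: read back the four range values
```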

We can’t say we’ve seen that many DIY robotic vacuums here at Hackaday, but we’ve certainly featured our fair share of hacks for the commercially available models.

Here’s Why Hoverboard Motors Might Belong In Robots

[madcowswe] starts by pointing out that the entire premise of ODrive (an open-source brushless motor driver board) is to make use of inexpensive brushless motors in industrial-type applications. This usually means using hobby electric aircraft motors, but robotic applications sometimes need more torque than those motors can provide. Adding a gearbox is one option, but there is another: so-called “hoverboard” motors are common and offer a frankly outstanding torque-to-price ratio.

A teardown showed that the necessary mechanical and electrical interfacing looks manageable, so prototyping has begun. These motors are really designed for spinning a tire on the ground rather than driving other loads, but [madcowswe] believes that with the addition of an encoder and the right fixtures, they could form the basis of an excellent robot arm. The ODrive project was a contender for the 2016 Hackaday Prize and we can’t wait to see where this ends up.

Open Source Motor Controller Makes Smooth Moves With Anti-Cogging

Almost two years ago, a research team showed that it was possible to get fine motor control from cheap, brushless DC motors. Normally this is not feasible because the motors are built in such a way that the torque applied is not uniform for every position of the motor, a phenomenon known as “cogging”. This is fine for something that doesn’t need low-speed control, like a fan motor, but for robotics it’s a little more important. Since that team published their results, though, we are starting to see others implement their own low-speed brushless motor controllers.

The new method of implementing anti-cogging maps out the holding torque required at every position of the motor’s shaft so that this information can be used later on. Of course this requires a fair amount of calibration; [madcowswe] reports that the process takes around 5-10 minutes. [madcowswe] also analyzed his motors to show how much harmonic content these torque waveforms contain, which helps explain how the phenomenon arises and how to eliminate it.
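
The real implementation lives in the ODrive firmware, but the core idea is easy to sketch: creep through every encoder position, record the current it takes to hold the rotor there, and replay that table as a feedforward term during normal operation. The toy Python below simulates that with a made-up cogging profile; none of the names or numbers come from the actual ODrive code.

```python
import math

ENCODER_CPR = 4000        # encoder counts per revolution (illustrative)
COGGING_CYCLES = 12       # pretend the motor has 12 cogging cycles per rev

class FakeMotor:
    """Stand-in for the real drive: the current needed to hold a position is
    whatever cancels the (made-up) cogging torque at that position."""
    def holding_current(self, pos):
        angle = 2 * math.pi * pos / ENCODER_CPR
        return 0.3 * math.sin(COGGING_CYCLES * angle)   # amps, illustrative

def calibrate_anticogging(motor, step=8):
    """Calibration pass: step through every position and record the holding
    current. Doing this on real hardware is why it takes several minutes."""
    return {pos: motor.holding_current(pos) for pos in range(0, ENCODER_CPR, step)}

def feedforward(table, encoder_count, step=8):
    """During normal operation, add this to the current command so the control
    loop no longer has to fight the cogging torque on its own."""
    key = (encoder_count // step) * step % ENCODER_CPR
    return table[key]

table = calibrate_anticogging(FakeMotor())
print(f"compensation at count 1234: {feedforward(table, 1234):+.3f} A")
```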

While [madcowswe] plans to add more features to this motor control algorithm, such as reverse mapping, scaling based on speed, and better memory usage, it’s already a solid implementation that shows a visible improvement over the stock behavior. The original research is also worth investigating if a cheaper, better motor is something you need.

Friction Differential Drive Is A Laser-Cut Triumph

Here on Hackaday we too often turn our heads and gaze at the novelty of 3D printing functional devices. It’s easy to forget that other techniques for assembling functional prototypes exist. Here, [Reuben] nails functional prototyping on the laser cutter with a real-world application: a roll-pitch friction differential drive built from nothing but off-the-shelf and laser-cut parts!

The centerpiece is held together with friction, where both the order of assembly and the slight wedged edge left by the laser cutter’s kerf keep the components from falling apart. Pulleys transfer motion from the would-be motor mounts, where the belts are tensioned with a roller bearing mechanism that’s pushed into position. Finally, the friction drive itself is made from roller-blade wheels, where the torque transferred to the plate is set by just how tightly the top screw is tightened onto the wheels. We’d say that [Reuben] is pushing boundaries with this build, but that’s not quite true. Rather, he’s combining a series of repeatable motifs to assemble a working mechanism that is both beautiful and complex.

This design is an old-school wonder from 2012 uncovered from a former Stanford course. The legendary CS235 aimed to teach “unmechanically-minded” roboticists how to build a host of mechanisms in the same spirit as MIT’s How-to-make-almost-Anything class. While CS235 doesn’t exist anymore, don’t fret. [Reuben] kindly posted his best lectures online for the world to enjoy.
