An Interview With Alex Williams, Grand Prize Winner

Alex Williams pulled off an incredible engineering project. He developed an Autonomous Underwater Vehicle (AUV) that uses a buoyancy engine rather than propellers for propulsion, and he made the entire project Open Source and Open Hardware.

The design aims to make extended-duration missions possible by using very little power to move the vessel. Just as remarkable as the project itself is the goal Alex set for himself: to document the project so thoroughly that it is fully reproducible. His success in both of these areas is what makes the Open Source Underwater Glider the perfect Grand Prize winner for the 2017 Hackaday Prize.

We got to sit down with Alex the morning after he won to talk about the project and the path he took to get here.

Continue reading “An Interview With Alex Williams, Grand Prize Winner”

Gorgeous Engineering Inside Wheels Of A Robotic Trail Buddy

Robots are great in general, and [taylor] is currently working on something a bit unusual: a 3D printed explorer robot to autonomously follow outdoor trails, named Rover. Rover is still under development, and [taylor] recently completed the drive system and body designs, all shared via OnShape.

Rover has 3D printed 4.3:1 reduction planetary gearboxes embedded into each wheel, with off-the-shelf bearings and brushless motors. A Raspberry Pi sits in the driver’s seat, and the goal is to use a version of NVIDIA’s TrailNet framework for GPS-free navigation of paths. As a result, [taylor] hopes to end up with a robotic “trail buddy” that can be made with off-the-shelf components and 3D printed parts.
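A planetary stage with a fixed ring gear gives a sun-to-carrier reduction of 1 + ring teeth / sun teeth, so a 4.3:1 ratio falls out of quite ordinary tooth counts. A quick sketch of the math (the tooth counts below are hypothetical, not taken from [taylor]’s design):

```python
def planetary_reduction(sun_teeth: int, ring_teeth: int) -> float:
    """Reduction for a sun-driven, carrier-output, fixed-ring planetary stage."""
    return 1 + ring_teeth / sun_teeth

# Hypothetical tooth counts that happen to produce Rover's 4.3:1 ratio.
print(f"{planetary_reduction(10, 33):.1f}")  # -> 4.3
```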

Moving the motors and gearboxes into the wheels themselves makes for a very small main body on the robot, and it’s more than a bit strange to see the wheel spinning opposite to its hub. Check out the video showcasing the latest development of the wheels, embedded below.

Continue reading “Gorgeous Engineering Inside Wheels Of A Robotic Trail Buddy”

Real-Life Electronic Neurons

All the kids down at Stanford are talking about neural nets. Whether this is due to the actual utility of neural nets or because all those kids were born after AI’s last death in the mid-80s is anyone’s guess, but there is one significant drawback to this tiny subset of machine intelligence: it’s a complete abstraction. Nothing called a ‘neural net’ is actually like a nervous system; there are no dendrites or axons, and you can’t learn how to do logic by connecting neurons together.

NeuroBytes is not a strange platform for neural nets. It’s physical neurons, rendered in PCBs and Molex connectors. Now, finally, it’s a Kickstarter project, and one of the more exciting educational electronics projects we’ve ever seen.

Regular Hackaday readers should be very familiar with NeuroBytes. It began as a project for the Hackaday Prize all the way back in 2015. There, it was recognized as a finalist for Best Product. Since then, the team behind NeuroBytes has received an NSF grant, been certified Open Source Hardware through OSHWA, and produced enough NeuroBytes to recreate the connectome of a flatworm. It’s doubtful the team actually has the patience to recreate the brain of even the simplest organism, but it’s already an impressive feat.

The highlights of the NeuroBytes Kickstarter include seven different types of neurons for different sensory systems, kits to test the patellar reflex, and what is probably most interesting to the Hackaday crowd, a Braitenberg Vehicle chassis, meant to test the ideas set forth in Valentino Braitenberg’s book, Vehicles: Experiments in Synthetic Psychology. If that book doesn’t sound familiar, BEAM robots probably do; that’s where the idea for BEAM robots came from.

It’s been a long, long journey for [Zach] and the other creators of NeuroBytes to get to this point. It’s great that this project is now finally in the wild, and we can’t wait to see what comes of it. Hopefully a full flatworm connectome.

Bluetooth Photo Booth Gets Vetting At Wedding

With just two weeks to go before his friends’ wedding, [gistnoesis] built a well-featured robotic photo booth. Using a Bluetooth PS3 controller, guests could move the camera around, take a picture, style it in one of several ways (or not), and print it out with a single button press.

The camera is mounted on a DIY 2-axis gimbal made from extruded aluminium and 3D-printed parts. It can be moved left/right with one joystick and up/down with the other. [gistnoesis] set up a four-panel split-screen display that shows the live feed from the camera and a diagram of the controls. The third panel shows the styled picture, and guests could explore the camera roll on the fourth panel.
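Joystick-to-gimbal control like this usually boils down to a deadzone, rate scaling, and a clamp at the mechanical limits. A hypothetical sketch of the idea (none of these values or names come from [gistnoesis]’s build):

```python
DEADZONE = 0.1    # ignore stick noise near center
MAX_RATE = 45.0   # degrees per second at full deflection (illustrative)

def stick_to_rate(axis: float) -> float:
    """Convert a -1..1 joystick axis value into a gimbal slew rate."""
    if abs(axis) < DEADZONE:
        return 0.0
    sign = 1.0 if axis > 0 else -1.0
    # Rescale so motion ramps smoothly from zero just outside the deadzone.
    return sign * (abs(axis) - DEADZONE) / (1.0 - DEADZONE) * MAX_RATE

def update_angle(angle, axis, dt, lo=-90.0, hi=90.0):
    """Integrate the commanded rate, clamped to the gimbal's travel limits."""
    return max(lo, min(hi, angle + stick_to_rate(axis) * dt))
```

One such mapping per stick covers both the pan and the tilt axis.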

The booth, dubbed LINN, uses two PCs running Lubuntu, one of which is dedicated to running an open-source neural style transfer program. After someone takes a picture, they can change the style to make it look like a Van Gogh or Picasso before printing it out. A handful of wedding attendees knew about some of the extra features, like manual exposure control and the five-second timer option, and the information spread gradually. Not only was LINN a great conversation piece, it inspired multi-generational collaboration.

Despite its assembled size, LINN packs up nicely into a couple of reusable shopping bags for transport (minus the TV, of course). This vintage photo booth we saw a few years ago is more of a one-piece solution, although it isn’t as feature-rich.

Continue reading “Bluetooth Photo Booth Gets Vetting At Wedding”

3D-Printed Robot Golem Only A Tiny Bit Creepy

ASPIR, the Autonomous Support and Positive Inspiration Robot, is a goblin-sized robot designed by [John Choi] that aims to split the difference between smaller hobbyist robots and more robust but pricey full-sized humanoids only a research institute could afford. By contrast, [John] estimates it cost a relatively meager $2,500 to create such a homunculus.

The robot consists of 33 servos of various types moving the limbs, controlled by an Arduino Mega with a servo control shield seated on it. The chassis uses 5 kg of filament and took 300 hours to print, and it has a skeleton made of aluminum hex rods. Spring-loaded RC shocks help reinforce the shoulders. There are some nice touches, like 3D-printed hands with living-hinge fingers, each digit actuated by a metal-gear micro servo. It stores its power bricks in its shins. For sensors it includes a chest-mounted webcam and a laser distance sensor.

The main design feature is the Android smartphone serving as its brains, and also — at least cosmetically — its eyes. Those eyes… might be just a teensy bit too Chucky for our taste. (Nice work, [John]!)

Robotic Arm Rivals Industrial Counterparts

We’ve seen industrial robotic arms in real life. We’ve seen them in classrooms and factories. Before today, we’ve never mistaken a homemade robotic arm for one of the price-of-a-new-home robotic arms. Today, [Chris Annin] made us look twice when we watched the video of his six-axis robotic arm. Most DIY arms have a personal flair from their creator, so we have to assume [Chris Annin] is either a robot himself or he intended to build a very clean-looking arm when he started.

He puts it through its paces in the video, available after the break, by starting with some stretches and weight-lifting, then following it up with a game of Jenga. After a hard day, we see the arm helping in the kitchen and even cracking open a cold one. At the ten-minute mark, [Chris Annin] walks us through the major components and talks about where to find many, many more details about the arm.

Many of the robotic arms on Hackaday are here by virtue of resourcefulness, creativity or unusual implementation but this one is here because of its similarity to the big boys.

Continue reading “Robotic Arm Rivals Industrial Counterparts”

Open Source Motor Controller Makes Smooth Moves With Anti-Cogging

Almost two years ago, a research team showed that it was possible to get fine motor control from cheap, brushless DC motors. Normally this is not feasible because the motors are built in such a way that the torque applied is not uniform for every position of the motor, a phenomenon known as “cogging”. This is fine for something that doesn’t need low-speed control, like a fan motor, but for robotics it’s a little more important. Since that team published their results, though, we are starting to see others implement their own low-speed brushless motor controllers.

The new method of implementing anti-cogging maps out the holding torque required at every position of the motor’s shaft so the information can be used later on. Of course this requires a fair amount of calibration; [madcowswe] reports that this method needs around 5-10 minutes of it. [madcowswe] also analyzed his motors to show how much harmonic content these waveforms contain, which helps in understanding how the phenomenon arises and how to eliminate it.
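In outline, the approach amounts to a per-position feedforward table: record the torque needed to hold each shaft position during calibration, then add that value back into every torque command. A generic sketch of the idea (not [madcowswe]’s actual firmware):

```python
import math

N_BINS = 256  # calibration points per revolution (illustrative)

def calibrate(measure_holding_torque):
    """Step through every shaft position and record the torque the controller
    had to apply to hold it there; that residual is the cogging map."""
    return [measure_holding_torque(i / N_BINS) for i in range(N_BINS)]

def compensated_torque(requested, position, cog_table):
    """Add the stored holding torque back in as feedforward.

    `position` is the shaft angle as a fraction of one revolution."""
    idx = int(position * N_BINS) % N_BINS
    return requested + cog_table[idx]

# Stand-in motor whose cogging is a pure 6th-harmonic ripple, a common
# pattern for a three-phase brushless motor.
cogging = lambda pos: 0.05 * math.sin(2 * math.pi * 6 * pos)
table = calibrate(cogging)
```

With the table applied, the controller’s net output at any position is the requested torque plus the ripple needed to cancel the motor’s own cogging at that position.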

While [madcowswe] plans to add more features to this motor control algorithm such as reverse-mapping, scaling based on speed, and better memory usage, it’s a good implementation that has visible improvements over the stock motors. The original research is also worth investigating if a cheaper, better motor is something you need.