Locating Targets With Charm Courtesy Of A Life-Size Portal Turret

What better way to count down the last seven weeks to a big hacker camp like SHA2017 than by embarking on a last-minute, frantic build? That was [Yvo]’s thought when he decided to make a life-sized version of the adorably lethal turrets from Valve’s Portal video games. Since that build crossed the finish line back then without all of its planned features, he finished it up for the CCC Camp 2019 event, adding the ability to open, close, target, and shoot Nerf darts.

The design is based on [Yvo]’s miniature 2014 turret (also covered on Hackaday), and he details the new project in a first and second work log, along with a detailed explanation of how it all goes together and works. The 2017 version came together in a mere 50 days, but the complete build represents some 300 hours of 3D printing. It packs four Nerf guns which use flywheels to launch the darts, the wheels driven by quadcopter outrunner motors spinning at 25,000 RPM. The theoretical speed of a launched dart is over 100 km/h, with 18 darts per gun and a fire rate of two darts per second.
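As a sanity check on that figure, the flywheel rim speed puts an upper bound on the muzzle velocity. Here’s a minimal back-of-the-envelope calculation, assuming a flywheel diameter of around 40 mm (our guess, not a number from the build log):

```python
import math

rpm = 25_000        # flywheel speed quoted in the write-up
diameter_m = 0.040  # assumed flywheel diameter -- not a figure from [Yvo]

omega = rpm * 2 * math.pi / 60      # angular velocity, rad/s
rim_speed = omega * diameter_m / 2  # wheel surface speed, m/s

print(f"rim speed: {rim_speed:.0f} m/s ({rim_speed * 3.6:.0f} km/h)")
# Prints roughly 52 m/s (~188 km/h); with slip between wheel and foam the
# dart leaves slower, which squares with the quoted "over 100 km/h".
```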

The basic movement control for the system is handled by an Arduino Mega, while the talking and vision aspects are taken care of by a Raspberry Pi 3+, which ultimately also makes the decisions about how to move the system. As one can see in the video after the link, the system seems to work pretty well, with a negligible number of fatalities among company employees.
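The write-up doesn’t spell out the protocol between the two boards, but a division of labor like this usually boils down to the Pi sending short serial commands that the Mega executes. A minimal sketch of what the Pi side could look like, with the port and command names entirely our invention:

```python
import serial  # pyserial

# Hypothetical serial link to the Arduino Mega; neither the port nor the
# command set below comes from [Yvo]'s actual firmware.
mega = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def send(command: str) -> str:
    """Send one newline-terminated command and wait for the Mega's reply."""
    mega.write((command + "\n").encode("ascii"))
    return mega.readline().decode("ascii").strip()

send("OPEN")        # unfold the turret shell
send("AIM 12 -3")   # pan/tilt offsets in degrees, from the Pi's vision
send("FIRE 2")      # spin up the flywheels and launch two darts
```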

Though decidedly not a project for the inexperienced tinkerer, [Yvo] has made all of the design files available along with the software. We’re still dubious about the claims about the promised cake for completing one of these turrets, however.

Continue reading “Locating Targets With Charm Courtesy Of A Life-Size Portal Turret”

DIY Personal Assistant Robot Hears And Sees All

Who wouldn’t want a robot that can fetch them a glass of water? [Saral Tayal] didn’t just think that; he jumped right in and built his own personal assistant robot. This isn’t just some remote-controlled rover, though. The robot actually listens to his voice and recognizes his face.

The body of the robot is the common “Rover 5” platform, to which [Saral] added a number of 3D-printed parts. A forklift-like sled gives the robot the ability to pick things up. Some of the parts are more about form than function – [Saral] loves NASA’s Spirit and Opportunity Mars rovers, so he added some simulated solar cells and other greebles.

The Logitech webcam up front is very functional — images are fed to machine learning models, while audio is processed to listen for commands. This robot can find and pick up 90 unique objects.

The robot’s brains are a Raspberry Pi running TensorFlow for object recognition. Some of the models [Saral] is using are pretty large – so big that the Pi could only manage a couple of frames per second at 100% CPU utilization. A Google Coral coprocessor sped things up quite a bit, while only using about 30% of the Pi’s processor.
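That 90-object figure matches the 90-class COCO label map that stock SSD MobileNet detection models ship with, so the detection side likely resembles Google’s own Coral examples. A sketch along those lines, following the published pycoral API, with the model and image file names as placeholders rather than [Saral]’s actual files:

```python
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

# File names are placeholders, not [Saral]'s actual model or input.
interpreter = make_interpreter("ssd_mobilenet_v2_coco_edgetpu.tflite")
interpreter.allocate_tensors()

image = Image.open("frame.jpg").convert("RGB")
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))
interpreter.invoke()  # runs on the Coral's Edge TPU, not the Pi's CPU

for obj in detect.get_objects(interpreter, score_threshold=0.5, image_scale=scale):
    print(obj.id, obj.score, obj.bbox)  # class index, confidence, bounding box
```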

It takes several motors to drive the robot’s tracks and sled. This is handled by two Roboclaw motor controllers, which are themselves commanded by the Pi.
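Roboclaw controllers speak a packet-serial protocol, and BasicMicro publishes a Python library for it, so commanding them from the Pi can be as simple as the following sketch. The port, baud rate, and speeds here are our assumptions, not values from [Saral]’s code:

```python
from roboclaw_3 import Roboclaw  # BasicMicro's published Python library

rc = Roboclaw("/dev/ttyS0", 38400)  # assumed port and baud rate
rc.Open()

ADDRESS = 0x80  # factory-default packet-serial address

# Simple drive commands take a value from 0 (stop) to 127 (full speed).
rc.ForwardM1(ADDRESS, 64)  # left track at half speed
rc.ForwardM2(ADDRESS, 64)  # right track at half speed
```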

We’ve seen quite a few mobile robot rovers over the years, but [Saral]’s ‘bot is one of the most functional designs out there. Even better is the fact that it is completely open source. You can find the code and 3D models on his GitHub repo.

Check out a video of the personal assistant rover in action after the break.

Continue reading “DIY Personal Assistant Robot Hears And Sees All”

Designing An Advanced Autonomous Robot: Goose

Robotics is hard, maybe not quite as difficult as astrophysics or understanding human relationships, but designing a competition-winning bot from scratch was never going to be easy. OK, so [Paul Bupe, Jr]’s robot, named ‘Goose’, did not quite win the competition, but we’re very interested to learn what golden eggs it might lay in the aftermath.

The mechanics of the bot are based on a fairly standard dual-tracked drive system, which makes controlling a turn much easier than if it used wheels. Why make life more difficult than it already is? But what we’re really interested in is the design of the control system and the rationale behind those design choices.

The diagram on the left might look complicated, but essentially the system is based on two ‘brains’, the Teensy microcontroller (MCU) and a Raspberry Pi, though most of the grind is performed by the MCU. Running at 96 MHz, the MCU is fast enough to process data from the encoders and IMU in real time, enabling the bot to respond quickly and smoothly to its sensors. More complicated and ‘heavier’ tasks such as LiDAR processing and computer vision (CV) are performed on the Pi, which runs the Robot Operating System (ROS), communicating with the MCU by means of a couple of ‘nodes’.
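The write-up doesn’t reproduce the node code, but a typical bridge between ROS on the Pi and a serial-attached MCU is only a few lines. A sketch of what such a node might look like, with the topic name, port, and text framing invented for illustration rather than taken from Goose:

```python
#!/usr/bin/env python
import rospy
import serial
from geometry_msgs.msg import Twist

# Port and message framing are our guesses, not Goose's actual interface.
mcu = serial.Serial("/dev/ttyACM0", 115200)

def on_cmd_vel(msg):
    # Forward the commanded linear/angular velocity to the Teensy, which
    # closes the loop locally using its encoder and IMU data.
    mcu.write(f"V {msg.linear.x:.3f} {msg.angular.z:.3f}\n".encode())

rospy.init_node("mcu_bridge")
rospy.Subscriber("cmd_vel", Twist, on_cmd_vel)
rospy.spin()
```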

The competition itself dictated that the bot should travel in large circles within the walls of a large box, whilst avoiding particular objects. Obviously, GPS was not an option and dead reckoning alone was never going to keep the machine on track, so it relied heavily on LiDAR point cloud data to effectively pinpoint the location of the robot at all times. Now we really get to the crux of the design, where all the available sensors are combined and fed into a particle filter algorithm.
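In outline, a particle filter keeps a cloud of pose guesses, moves each one by the odometry estimate, and reweights them by how well the LiDAR returns match the map. A stripped-down illustration of those two steps follows; this is not code from Goose, whose filter is considerably more involved:

```python
import numpy as np

N = 500
# Each particle is one pose guess: x, y, heading within the arena.
particles = np.random.uniform([0, 0, -np.pi], [5, 5, np.pi], size=(N, 3))
weights = np.ones(N) / N

def predict(v, w, dt, noise=(0.02, 0.05)):
    """Move every particle by the encoder/IMU estimate, plus noise."""
    particles[:, 2] += w * dt + np.random.normal(0, noise[1], N)
    step = v * dt + np.random.normal(0, noise[0], N)
    particles[:, 0] += step * np.cos(particles[:, 2])
    particles[:, 1] += step * np.sin(particles[:, 2])

def update(measured_range, expected_range_fn, sigma=0.1):
    """Reweight by how well a LiDAR range fits each pose, then resample."""
    global particles, weights
    expected = expected_range_fn(particles)  # predicted range per particle
    weights *= np.exp(-0.5 * ((measured_range - expected) / sigma) ** 2)
    weights /= weights.sum()
    keep = np.random.choice(N, N, p=weights)  # importance resampling
    particles, weights = particles[keep], np.ones(N) / N
```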

What we particularly love about this project is how clearly everything is explained, without too many fancy terms or acronyms. [Paul Bupe, Jr] has obviously taken the time to reduce the overall complexity to more manageable concepts that encourage us to explore further. Perhaps [Paul] himself might find the time to produce individual tutorials for each of the robot’s systems?

We could well be reading far too much into the name of the robot, ‘Goose’ being Captain Marvel’s bizarre ‘trans-species’ cat that ends up laying a whole load of eggs. But could this robot help reach a de facto standard for small robots?

We’ve seen other competition robots on Hackaday, and hope to see a whole lot more!

Video after the break: Continue reading “Designing An Advanced Autonomous Robot: Goose”

An Exoskeleton Arm For A Hacker On A Budget

Whether it is motivated by a dream of superhuman strength courtesy of a mech suit, or of mobility for those with impaired muscle function, the powered exoskeleton exerts a curious fascination among engineers. The idea of a machine-augmented human body achieving great things is thwarted, though, by the difficulty of the task: actuators and power sources small enough to be worn comfortably represent a significant challenge that is not easily overcome. It’s a subject that has captivated [Kristjan Berce] ever since, at a young age, he saw his grandmother struggling with lifting, and he presents a working powered exoskeleton arm as a proof of his ideas.

It’s a wonderful exercise in low-tech construction with hand tools and a drill press on pieces of aluminium and wood. Motive power comes from an automotive windscreen wiper motor, and electrical power comes from a hefty LiPo attached to the device’s harness. There is a feedback potentiometer incorporated into the elbow joint, and an Arduino oversees the operation under the direction of a pair of glove-mounted buttons. It’s certainly impressive to see it in the video below lifting a bicycle, though we wonder how its weight might affect someone with less muscle function than average.
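The firmware isn’t described in detail, but the control scheme (two glove buttons commanding the motor, with the elbow potentiometer providing feedback) reduces to a very small decision loop. Here it is sketched in Python for readability; the real thing runs on the Arduino, and the angle limits below are invented for illustration:

```python
def control_step(btn_up: bool, btn_down: bool, elbow_angle: float,
                 min_angle: float = 10.0, max_angle: float = 150.0) -> int:
    """Map the glove buttons to a motor command (+1 lift, -1 lower, 0 hold),
    refusing to drive the elbow past the potentiometer's soft limits."""
    if btn_up and elbow_angle < max_angle:
        return 1
    if btn_down and elbow_angle > min_angle:
        return -1
    return 0
```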

Projects like this one are very good to see, because there’s a chance that somebody out there may be helped by building one of these. However, there is always a note of caution to be struck, as the best solutions come from those who need them and not from those who merely think they have the solution. We have written about the Engineer Saviour Trap here in years past.

This isn’t the first prosthetic arm we’ve seen, though; we covered a hackerspace in England printing one for a local youngster.

Continue reading “An Exoskeleton Arm For A Hacker On A Budget”

Pick And Place Robot Built With Fischertechnik

We’d be entirely wrong to think that Fischertechnik is just a toy for kids. It’s also perfect for prototyping the control systems of robots. [davidatfsg]’s recent entry in the Hackaday Prize, Delta Robot, shows how complex robotics can be implemented without the hardship of having to drill, cut, bolt together, or weld components. The added bonus is that the machine can be completely disassembled non-destructively and rebuilt to a new and better design with little or no waste.

The project uses inverse kinematics running on an Arduino Mega to pick coloured objects off a moving conveyor belt and drop them into their respective bins. There’s also an optical encoder for regulating the speed of the conveyor, and a laser light beam for sensing when an object on the conveyor has reached the correct position to be picked.
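Delta-robot inverse kinematics has a well-known closed-form solution: solve for one shoulder angle with the arm in the YZ plane, then rotate the target point by ±120° for the other two arms. A Python sketch of that standard derivation follows, with all arm dimensions as placeholders rather than the actual Fischertechnik geometry:

```python
import math

F, E = 120.0, 40.0    # base and effector triangle side lengths, mm (assumed)
RF, RE = 80.0, 160.0  # upper arm and parallelogram link lengths, mm (assumed)

def _angle_yz(x0, y0, z0):
    """Shoulder angle in degrees for one arm lying in the YZ plane."""
    y1 = -0.5 * math.tan(math.radians(30)) * F
    y0 -= 0.5 * math.tan(math.radians(30)) * E
    a = (x0**2 + y0**2 + z0**2 + RF**2 - RE**2 - y1**2) / (2 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + RF * (b * b * RF + RF)  # discriminant
    if d < 0:
        raise ValueError("point not reachable")
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1)
    zj = a + b * yj
    return math.degrees(math.atan2(-zj, y1 - yj))

def inverse_kinematics(x, y, z):
    """Return the three shoulder angles for an effector position (z < 0)."""
    c, s = math.cos(math.radians(120)), math.sin(math.radians(120))
    return (_angle_yz(x, y, z),
            _angle_yz(x * c + y * s, y * c - x * s, z),  # arm at +120 deg
            _angle_yz(x * c - y * s, y * c + x * s, z))  # arm at -120 deg

print(inverse_kinematics(0, 0, -100))
```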

Not every component is ‘off the shelf’. [davidatfsg] 3D printed a simple nozzle for the actual ‘pick’, and the vacuum required was generated by the clever use of a pair of pneumatic cylinders and solenoid-operated air valves.

We’re pretty sure that this will not be the last project on Hackaday that uses Fischertechnik components, and it’s already the second that [davidatfsg] has concocted. Videos of the machine working after the break! Continue reading “Pick And Place Robot Built With Fischertechnik”

Etch-A-Selfie

Taking a selfie before the modern smartphone era was a true endeavor. Flip phones didn’t have forward-facing cameras, and if you go really far back to the days of film cameras, you needed to set a timer on your camera and hope, or get a physical remote shutter. You could also try to create a self-portrait on an Etch a Sketch, but that would take a lot of time and artistic skill. Luckily, in the modern world we can bring some of this old technology into the future and add a robot to create interesting retro selfies – without needing to be an artist.

The device from [im-pro] attaches two servos to the Etch a Sketch knobs. This isn’t really a new idea in itself, but the device also includes a front-facing camera, taking advantage of particularly inexpensive ESP32 camera modules. Combining the camera features with [Bart Dring]’s ESP32 Grbl port is a winner. Check out the code on [im-pro]’s GitHub.

Once the picture is taken, the ESP32 at the heart of the build handles the image processing and then draws the image on the Etch a Sketch. The robot needs a black-and-white image to draw and an algorithm for drawing it without “lifting” the stylus, and both tasks stretch the capabilities of such a small processor. It takes some time to work, but in the end the results speak for themselves.
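One common way to tackle both problems is to reduce the photo to edge pixels and then chain them into a single continuous path, so the stylus never has to lift. A rough illustration of that idea in Python (certainly not [im-pro]’s actual ESP32 code, which is far more refined):

```python
import numpy as np
from PIL import Image

# Downscale, grayscale, then crude edge detection by vertical gradient.
img = np.asarray(Image.open("selfie.jpg").convert("L").resize((100, 75)), float)
edges = np.abs(np.gradient(img)[0]) > 30
points = list(zip(*np.nonzero(edges)))

# Greedy nearest-neighbour ordering: always hop to the closest remaining
# edge pixel, keeping the single unbroken line an Etch a Sketch demands short.
path, current = [], points.pop(0)
while points:
    nxt = min(points, key=lambda p: (p[0] - current[0])**2 + (p[1] - current[1])**2)
    points.remove(nxt)
    path.append(nxt)
    current = nxt
# Each hop in `path` would then become a move for the Grbl port driving
# the two knob axes.
```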

The final project is definitely worth a look, if not for the interesting ESP32-controlled robot then for the image processing algorithm implementation. The ESP32 is a truly versatile platform, though, and is useful for building almost anything.

Continue reading “Etch-A-Selfie”

A 3D-Printable Mecanum Wheeled Robot Platform

If your interest lies in robotics, there is a multitude of different platforms you could build. [Teemu Laurila] was frustrated with what was on offer, so he designed his own, with four-wheel double-wishbone suspension and mecanum wheels for maximum flexibility.
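That flexibility comes from the mecanum wheels’ 45-degree rollers, which let the platform translate in any direction while it rotates. The standard wheel-speed mixing is only a few lines; here’s a sketch with placeholder chassis dimensions, not values from [Teemu]’s design:

```python
def mecanum_mix(vx, vy, wz, half_length=0.10, half_width=0.12):
    """Wheel speeds (FL, FR, RL, RR) for a body velocity: vx forward,
    vy to the left, wz counter-clockwise; dimensions in metres (assumed)."""
    k = half_length + half_width
    return (vx - vy - k * wz,   # front left
            vx + vy + k * wz,   # front right
            vx + vy - k * wz,   # rear left
            vx - vy + k * wz)   # rear right

print(mecanum_mix(0.0, 0.3, 0.0))  # pure sideways translation
```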

It’s a design that has been through multiple revisions since its first iteration in 2015, and along the way it’s clear some thought has gone into it. That double-wishbone suspension is angled for high ground clearance, and is fully sprung. Drive comes from small motor/gearboxes at each axle. The chassis, meanwhile, has plenty of space for a single-board computer, and has been specifically designed with the BeagleBone Black in mind.

This build isn’t fully DIY, as the mecanum wheels appear to be off-the-shelf items, but the rest of the project makes up for this. If you need to make your own, it’s hardly as though there aren’t any projects from which you can borrow components.

Continue reading “A 3D-Printable Mecanum Wheeled Robot Platform”