Landbeest, A Single Servo Walking Robot

Walking robots have a rich history both on and off the storied pages of Hackaday, but if you will pardon the expression, theirs is not a field that’s standing still. It’s always pleasing to see new approaches to old problems, and the Landbeest built by [Dejan Ristic] is a great example.

It’s a four-legged walker with a gait dictated by a cam-and-follower mechanism that allows it to perform the full range of leg movement with only one motor. Each cam can control more than one leg in synchronisation, and in his most recent prototype there are two such mechanisms that work on opposite corners of the four-legged machine. The legs are arranged so that the two corner-to-corner pairs pivot at their centres in a similar manner to a pair of scissors, allowing a servo to steer the robot as it walks.

The result certainly isn’t as graceful as [Theo Jansen]’s Strandbeest, from which it evidently takes inspiration for its name, but it’s no less capable for it. After the break you can see a video he’s posted that clearly illustrates its operation and demonstrates its ability to traverse obstacles.

The only things missing are the files and software, should you wish to create your own. He’s unapologetic about this, pointing out that he’d prefer to wait until he’s satisfied with the design before letting it go. Since he’s put a lot of work in so far and shows no sign of stopping, we’re sure he’ll reach that point soon enough.

Continue reading “Landbeest, A Single Servo Walking Robot”

Robotic Skin Sees When (and How) You’re Touching It

Cameras are getting less and less conspicuous. Now they’re hiding under the skin of robots.

A team of researchers from ETH Zurich in Switzerland has recently created a multi-camera optical tactile sensor that can sense the distribution of contact forces applied across its surface. The sensor uses a stack-up involving a camera, LEDs, and three layers of silicone to optically detect any disturbance of the skin.

The scheme is modular and uses four cameras in this example, but it can be scaled up from there. During manufacture, the camera and LED circuit boards are placed and a roughly 5 mm layer of firm silicone is poured over them. Next comes a 2 mm layer doped with spherical particles, followed by a final 1.5 mm layer of black silicone. The cameras track the particles as they move, and from that motion the system infers the deformation of the material and reconstructs the contact force distribution causing it. The demo uses fairly inexpensive hardware: Raspberry Pi cameras monitored by an NVIDIA Jetson Nano Developer Kit, providing about 65,000 pixels of resolution in total.
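For the curious, the particle-tracking step is bread-and-butter computer vision. Here is a minimal sketch of how you might track the embedded particles with OpenCV optical flow; the camera index, feature count, and thresholds are our own illustrative guesses, not the team’s actual pipeline:

```python
import cv2
import numpy as np

# Illustrative sketch only: track embedded marker particles between
# frames and treat their displacement as a proxy for skin deformation.
cap = cv2.VideoCapture(0)          # one of the sensor's cameras (assumed)

ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Find the particles (bright blobs under the black top layer).
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                   qualityLevel=0.01, minDistance=5)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Lucas-Kanade optical flow yields a displacement vector per particle.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                   prev_pts, None)
    good = status.ravel() == 1
    shift = (next_pts[good] - prev_pts[good]).reshape(-1, 2)

    # This displacement field is the raw signal that a trained model
    # would map to a contact force distribution.
    print("mean particle shift:", np.linalg.norm(shift, axis=1).mean())

    prev_gray = gray
    prev_pts = next_pts[good].reshape(-1, 1, 2)
```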

Apart from providing more information about the forces applied to its surface, the sensor also has a larger contact area and a thinner profile than other camera-based systems, since it doesn’t require reflective components. It regularly recalibrates itself using a convolutional neural network pre-trained with data from three cameras and updated with data from all four. Possible future applications include soft robotics and improving touch-based sensing with the aid of computer vision algorithms.
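The team’s actual network isn’t spelled out here, but the shape of the calibration problem — camera frames in, a force map out — is easy to sketch. Here’s a toy PyTorch stand-in; the layer sizes and the 128×128 four-camera input are assumptions chosen only to land near that 65,000-pixel figure:

```python
import torch
import torch.nn as nn

class ForceMapNet(nn.Module):
    """Toy stand-in for the calibration CNN: four camera frames in,
    a per-cell normal-force map out. All sizes are assumptions."""
    def __init__(self, cameras=4, grid=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(cameras, 16, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(grid),
        )
        self.head = nn.Conv2d(32, 1, kernel_size=1)   # force per grid cell

    def forward(self, x):            # x: (batch, cameras, height, width)
        return self.head(self.features(x)).squeeze(1)

net = ForceMapNet()
frames = torch.rand(1, 4, 128, 128)  # 4 x 128 x 128 ~ 65,000 pixels total
force_map = net(frames)              # (1, 16, 16) estimated force grid
```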

While self-aware robotic skins may not be on the market any time soon, this certainly opens up the possibility of robots that can detect when too much force is being applied to their structures: the machine equivalent of pain.

Continue reading “Robotic Skin Sees When (and How) You’re Touching It”

Little Flash Charges In 40 Seconds Thanks To Super Capacitors

We’ve all committed the sin of making a little Arduino robot and running it off AA batteries. Little Flash is better than that, and runs off three 350 F capacitors.

In fact, that’s the entire mission of the robot: [Mike Rigsby] wants people to know there’s a better way. What’s really cool is that a 40-second charge at 10 A lets the robot run for over 25 minutes!
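Some back-of-the-envelope math shows why that’s plausible. Assuming the three capacitors are wired in series and charged to around 8 V (our guesses; the topology and voltages aren’t specced here):

```python
# Back-of-the-envelope check of Little Flash's runtime (assumptions flagged).
C_EACH = 350.0              # farads, from the article
C_TOTAL = C_EACH / 3        # three caps in series (assumed topology)

V_FULL = 8.0                # volts at full charge (assumed, ~2.7 V per cap)
V_CUTOFF = 3.0              # volts where the electronics brown out (assumed)

# Usable energy between full and cutoff: E = 1/2 * C * (Vfull^2 - Vcut^2)
energy_j = 0.5 * C_TOTAL * (V_FULL**2 - V_CUTOFF**2)

P_AVG = 2.0                 # watts, assumed average draw of a small bot
runtime_min = energy_j / P_AVG / 60
print(f"~{energy_j:.0f} J usable, ~{runtime_min:.0f} min at {P_AVG} W")
```

With those guesses the result lands around 3.2 kJ and 27 minutes, right in line with the claimed runtime.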

The robot itself is really simple. The case is 3D printed with an eye towards simplicity. The brains are an Arduino Nano, and the primary input is a bump sensor. The robot runs around randomly but avoids getting stuck by using the classic reverse-and-turn on collision.
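[Mike]’s firmware runs on the Nano, but the reverse-and-turn behaviour itself fits in a dozen lines. Here’s the same logic as a Python sketch using gpiozero on a Raspberry Pi; the pin numbers and timings are invented for illustration:

```python
from time import sleep
import random

from gpiozero import Robot, Button

# Hypothetical wiring: two motor driver channels plus one bump switch.
robot = Robot(left=(4, 14), right=(17, 18))
bumper = Button(22)

while True:
    robot.forward()
    if bumper.is_pressed:              # collision!
        robot.backward()               # the classic reverse...
        sleep(0.5)
        turn = robot.left if random.random() < 0.5 else robot.right
        turn()                         # ...and turn a random direction
        sleep(0.4)
    sleep(0.05)
```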

It’s cool to see how far these capacitors have come. We remember people wondering about these high-priced specialty parts when they first dropped on the hobby scene, but they’re becoming more and more prevalent compared to other options such as coin cells and solder-tab lithium batteries for powering PCBs.

This Arduino Keeps Its Eyes On You

[Will] wanted to build some animatronic eyes that didn’t require high-precision 3D printing. He wound up with a forgiving design that uses an Arduino and six servo motors. You can see the eyes moving around in the video below.

The bill of materials is pretty simple, featuring an Arduino, a driver board, and a joystick. The 3D-printed parts are easy to print with no supports and will work with PLA. Other than opening up holes, there wasn’t much post-processing required, though he did sand the actual eyeballs, which sounds painful.
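The write-up doesn’t dissect the Arduino sketch, but the core job, turning joystick X/Y into coordinated servo angles, is straightforward. Here’s a rough Python equivalent assuming a PCA9685-style servo driver; the channel layout, travel limits, and centre positions are all our own placeholders, and only four of the six servos are shown:

```python
from adafruit_servokit import ServoKit   # assumes a PCA9685-style driver

kit = ServoKit(channels=16)
PAN_L, PAN_R, TILT_L, TILT_R = 0, 1, 2, 3   # hypothetical channel layout

def joystick_to_eyes(x, y):
    """Map joystick axes (-1..1) to eye servo angles (placeholder ranges)."""
    pan = 90 + x * 30     # centred at 90 degrees, +/-30 degrees of travel
    tilt = 90 + y * 20
    for channel in (PAN_L, PAN_R):
        kit.servo[channel].angle = pan
    for channel in (TILT_L, TILT_R):
        kit.servo[channel].angle = tilt

joystick_to_eyes(0.5, -0.2)   # look right and slightly down
```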

Continue reading “This Arduino Keeps Its Eyes On You”

FitSocket Is A Portal To Better Prostheses

Traditionally, sockets for prostheses are created by making a plaster cast of the limb being fitted and are then sculpted in carbon fiber. It’s an expensive and time-consuming process, and what is supposed to be a customized socket often turns out to be an uncomfortable disappointment. Though prosthetists design these sockets specifically to take pressure off the more rigid areas of tissue, this usually ends up putting more pressure on the softer areas, causing pain and discomfort.

An MIT team led by [Arthur Petron] wants to make prosthesis sockets more comfortable and better customized. They created FitSocket, a machine that assesses the rigidity of limb tissue. You can see it in motion after the break.

FitSocket is essentially a ring of 14 actuators that gently prod the limb and measure how much pressure it takes to push in the tissue. By repeating this process over the entire limb, [Petron] can create a map that shows the varying degrees of stiffness or softness in the tissue.
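Conceptually, each actuator yields one data point: push with a known force, measure how far the tissue yields, and the ratio gives a local stiffness. Here’s a toy version of that map-building step, with invented numbers standing in for real probe readings:

```python
import numpy as np

N_PROBES = 14   # one stiffness sample per actuator in the ring

# Invented example readings at one ring position around the limb:
applied_force_n = np.random.uniform(2.0, 8.0, N_PROBES)   # newtons
indentation_mm = np.random.uniform(1.0, 6.0, N_PROBES)    # millimetres

# Simple linear stiffness estimate at each probe site: k = F / x
stiffness = applied_force_n / indentation_mm               # N/mm

# Sliding the ring along the limb and stacking rows like this one
# builds the 2D stiffness map described above.
print(stiffness.round(2))
```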

We love to see advancements in prostheses. Here’s an electronic skin that brings feeling to artificial fingertips.

Continue reading “FitSocket Is A Portal To Better Prostheses”

Sensing, Connected, Utility Transport Taxi For Level Environments

If that sounds like a mouthful, just call it SCUTTLE – the open-source mobile robot designed at Texas A&M University. SCUTTLE is a low-cost (under $350) robot designed for teaching Aggies at the Multidisciplinary Engineering Technology (MXET) program, where it is used for in-lab lessons and semester projects for the MXET 300 – Mobile Robotics undergraduate course. Since it is designed for academic purposes, the robot is very well documented, making it easy to replicate when you follow the instructions. In fact, the team is looking for others to build SCUTTLEs and give them feedback in order to improve the design.

Available on the SCUTTLE website is a large collection of videos to walk you through fabrication, electronics setup, robot assembly, programming, and robot operation. They are designed to help students build and operate the mobile robot within one semester. Most of the mechanical and electronic parts needed for the robot are off-the-shelf and easy to procure, and the remaining custom parts can easily be 3D printed. Its modular design allows you the freedom to try different options, features, and upgrades. SCUTTLE is powerful enough to carry a payload of up to 9 kg (20 pounds), allowing additional hardware to be added. To keep cost low and construction easy, the robot uses a simple two-wheel drive system with a pair of geared motors. This forces the robot to literally scuttle in a “non-holonomic” fashion, moving from origin to destination through a sequence of left/right turns and forward moves, which makes motion planning interestingly tricky.
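That two-wheel arrangement is the classic differential-drive model, and dead reckoning from the wheel encoders is the standard way to track position relative to the starting origin. Here’s a generic sketch of the odometry update; the wheel radius, track width, and encoder resolution are placeholders rather than SCUTTLE’s real dimensions:

```python
import math

# Placeholder geometry; not SCUTTLE's real dimensions.
WHEEL_RADIUS = 0.04     # metres
WHEEL_BASE = 0.30       # metres between the drive wheels
TICKS_PER_REV = 1024    # encoder resolution (assumed)

x, y, theta = 0.0, 0.0, 0.0   # pose relative to the starting origin

def update_pose(dticks_left, dticks_right):
    """Dead-reckon a differential-drive pose from encoder tick deltas."""
    global x, y, theta
    d_left = 2 * math.pi * WHEEL_RADIUS * dticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * dticks_right / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2            # forward travel
    d_theta = (d_right - d_left) / WHEEL_BASE    # heading change
    x += d_centre * math.cos(theta + d_theta / 2)
    y += d_centre * math.sin(theta + d_theta / 2)
    theta += d_theta

update_pose(512, 600)   # right wheel travelled further, so we turned left
print(round(x, 3), round(y, 3), round(theta, 3))
```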

The SCUTTLE robot is programmed using Python 3 running under Linux and has been tested on both the BeagleBone Blue and the Raspberry Pi. The SCUTTLE software guide is a good place to get acquainted with the system architecture.

The standard configuration uses ultrasonic sensors for collision avoidance, a standard USB camera for vision, and encoders coupled to the wheel drive pulleys for determining position with respect to the starting origin. An optional USB LiDAR can be added for area mapping. The extra payload capacity allows additional sensors, actuators, or battery packs to be added.

To complement the information on the website, additional resources are posted on GitHub, GrabCAD and YouTube. Building a SCUTTLE robot ought to be a great group project for maker spaces wanting to get hackers started with robotics. We have covered many educational robot projects in the past, but the SCUTTLE really shines with its ability to carry a pretty decent payload at a low cost.

Continue reading “Sensing, Connected, Utility Transport Taxi For Level Environments”

Making A Robotic Dog Better By Adding Springiness Without Springs

Getting a legged robot to stay upright, especially a quadruped or biped, can be a challenging undertaking. To experiment with different approaches, [James Bruton] built a robot dog test platform and is playing with “dynamic compliant simulated springs“, or in other words, using the motors to act as though they were springs and dampers.

When robotic legs are kept stiff, they tend to reduce the stability of the platform because of the robot’s sudden, erratic movements, especially on uneven surfaces. With a back-drivable joint arrangement, [James] uses a limited holding current on the motor and monitors the position of the motor shaft with an encoder. When a leg experiences a resisting force, it will have some “give”, and then the motor will return it to its intended position more slowly. Using an IMU on top of the robot, it can detect when it starts leaning to one side and temporarily soften the legs on the other side to balance itself.
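In control terms this is impedance control: rather than commanding a stiff position, the motor torque is computed as if a virtual spring and damper connected the shaft to its target. Here’s a bare-bones sketch of the idea; the gains, limits, and IMU interface are stand-ins, not [James]’s actual values:

```python
# Bare-bones virtual spring-damper ("impedance control") sketch.
K_SPRING = 6.0    # Nm/rad: how hard the joint pulls back toward its target
K_DAMPER = 0.4    # Nm/(rad/s): resists velocity so the joint doesn't bounce

def joint_torque(target, angle, velocity, stiffness=K_SPRING):
    """Torque as if a spring and damper connected the shaft to its target."""
    return stiffness * (target - angle) - K_DAMPER * velocity

def side_stiffness(imu_roll, side):
    """Soften the legs opposite the lean (side is +1 right, -1 left).
    The 0.3 floor and the gain of 2.0 are arbitrary illustration values."""
    if imu_roll * side < 0:       # this side is opposite the lean
        return K_SPRING * max(0.3, 1.0 - 2.0 * abs(imu_roll))
    return K_SPRING
```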

This is quite a common technique in legged robots, but [James] does an excellent job of explaining just how it works. He hopes to use the lessons learned from the test platform to improve or redesign his already impressive OpenDog.

We’ve seen a number of quadruped robots on Hackaday recently, including Boston Dynamics’ very expensive Spot as well as a low-cost robot dog that gives its big brothers a run for their money, doing some backflips in the process. Check out James’ video after the break. Continue reading “Making A Robotic Dog Better By Adding Springiness Without Springs”