Open Source Self-Driving Smartphone Robot

Our smartphones are incredibly powerful computers in their own right, yet we don’t often see them directly integrated into projects. Intel Intelligent Systems Lab has done exactly that with the release of OpenBot, an open source smartphone-based self-driving robot.

Most of the magic happens on the smartphone, which runs an app built on TensorFlow Lite that fuses the phone’s camera and sensor array with data from the ultrasonic sensors and wheel encoders on the robot. The robot itself is relatively simple: four geared DC motors, with motor drivers wired to an Arduino Nano that interfaces with the Android phone over serial.
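
We haven’t pored over OpenBot’s firmware, but the Arduino end of a link like this can be remarkably small. Here’s a minimal sketch of the idea, assuming a made-up comma-separated command format rather than OpenBot’s actual message framing:

```cpp
// Hypothetical sketch of the phone-to-robot serial link. OpenBot's real
// firmware and message framing differ; this only illustrates the idea.
const int LEFT_PWM = 5;   // assumed motor driver PWM pins
const int RIGHT_PWM = 6;

void setup() {
  Serial.begin(115200);   // the phone talks to the Nano over USB serial
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
}

void loop() {
  // Expect lines like "c128,200\n": a command byte, then left/right duty cycles.
  if (Serial.available() && Serial.read() == 'c') {
    int left = Serial.parseInt();
    int right = Serial.parseInt();
    analogWrite(LEFT_PWM, constrain(left, 0, 255));
    analogWrite(RIGHT_PWM, constrain(right, 0, 255));
  }
}
```

The phone does all the perception and planning; the Nano just has to obey.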

The app created by the Intel ISL team comes preloaded with three AI models: one for person following, and two for different modes of autonomous navigation. By connecting a Bluetooth controller to the smartphone and driving the robot around manually while collecting data, you can train a custom autonomous driving policy suited to your specific environment.

This looks like an excellent way to get a taste of autonomous robotics on a small budget, while still being a viable base for more demanding applications. We’ve seen only a few smartphone-based robots, like DriveMyPhone and SmartiPresense, which don’t have AI capabilities but are intended for telepresence applications. We’ve always wondered why we don’t see more projects built around cellphones, so we welcome the example.

Continue reading “Open Source Self-Driving Smartphone Robot”

Robots Learning To Understand Their Surroundings

Today it is pretty easy to build a robot with an onboard camera and have fun manually driving it through that first-person view. But builders with dreams of autonomy quickly learn there is a lot of work between camera installation and autonomously executing a “go to chair” command. Fortunately, we can draw upon work such as the View Parsing Network by [Bowen Pan, Jiankai Sun, et al].

When a camera image comes into a computer, it is merely a large array of numbers representing red, green, and blue color values, and our robot has no idea what that image represents. Over the past few years, computer vision researchers have found pretty good solutions for the problems of image classification (“is there a chair?”) and segmentation (“which pixels correspond to the chair?”). While useful for building an online image search engine, this is not quite enough for robot navigation.

A robot needs to translate those pixel coordinates into a real-world layout, and this is the problem the View Parsing Network offers to solve. Detailed in Cross-view Semantic Segmentation for Sensing Surroundings (DOI 10.1109/LRA.2020.3004325), the system takes in multiple camera views looking all around the robot. The results of image segmentation are then synthesized into a 2D top-down segmented map of the robot’s surroundings. (“Where is the chair located?”)
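
The View Parsing Network learns this cross-view transformation end-to-end, but it helps to see the underlying geometry spelled out. Here is a classical single-camera version of the pixel-to-floor problem, shown purely as an illustration and not as the paper’s method: back-project each labeled pixel onto an assumed flat floor with a pinhole camera model (all parameters below are made up):

```cpp
#include <cmath>
#include <cstdio>

// Classical pixel-to-floor back-projection under a flat-floor assumption.
// The View Parsing Network learns this mapping end-to-end instead; this
// pinhole-model version only makes the geometry concrete.
struct GroundPoint { double x, y; };  // meters, top-down robot-centric frame

GroundPoint pixelToGround(double u, double v,        // pixel coordinates
                          double fx, double fy,      // focal lengths (pixels)
                          double cx, double cy,      // principal point (pixels)
                          double h, double pitch) {  // camera height (m), downward tilt (rad)
    // Ray through this pixel in the camera frame (x right, y down, z forward).
    double rx = (u - cx) / fx;
    double ry = (v - cy) / fy;
    // Rotate the ray by the camera's downward pitch (about the x axis).
    double dy = std::cos(pitch) * ry + std::sin(pitch);   // downward component
    double dz = -std::sin(pitch) * ry + std::cos(pitch);  // forward component
    // Scale so the ray descends h meters and hits the floor
    // (assumes the pixel looks below the horizon, i.e. dy > 0).
    double t = h / dy;
    return { t * rx, t * dz };  // lateral offset and forward distance
}

int main() {
    // Where does pixel (400, 300) land for a camera 1.2 m up, tilted 0.3 rad down?
    GroundPoint p = pixelToGround(400, 300, 500, 500, 320, 240, 1.2, 0.3);
    std::printf("floor hit: %.2f m right, %.2f m ahead\n", p.x, p.y);
    return 0;
}
```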

The authors documented how to train a view parsing network in a virtual environment, and described the procedure for transferring a trained network to run on a physical robot. Today this process demands a significantly higher skill level than “download Arduino sketch,” but we hope such modules will become more plug-and-play in the future, for better and smarter robots.

[IROS 2020 Presentation video (duration 10:51) requires free registration, available until at least Nov. 25th 2020. One-minute summary embedded below.]

Continue reading “Robots Learning To Understand Their Surroundings”

Rebuilding A Hero (the Robot, Not The Sandwich)

When [Scott Baker] found a Heathkit Hero Junior on eBay, he grabbed it. He had one as a kid, but it was sold off long ago. The robot arrived with no electronics, so the first order of business is to give it some new, modern brains, including an ATmega328 and a Raspberry Pi. You can see the start of the project in the video below.

So far, you can see a nice teardown of the chassis and what’s left of the little robot’s drive system. This wasn’t the big Hero-1 that you probably remember, but it was still a pretty solid platform, especially for the time it was on the market.

Continue reading “Rebuilding A Hero (the Robot, Not The Sandwich)”

Wheels Or Legs? Why Not Both?

Out of the thousands of constraints and design decisions to consider when building a robot, the way it moves is perhaps one of the most fundamental. The method of movement constrains the design and use case for the robot more than almost any other parameter. A team of researchers at Texas A&M led by [Kiju Lee] is trying to have their cake and eat it too by building a robot with wheels that transform into legs, known as a-WaLTR (Adaptable Wheel-and-Leg Transformable Robot).

a-WaLTR was designed to conquer one of wheeled robots’ biggest obstacles: stairs. By adding a bit of smarts to determine whether a given terrain is better handled by wheels or legs, a-WaLTR can convert its segmented wheels into simple legs. Rather than implementing complex and error-prone articulated legs, the team stuck with robust appendages that remind us a little of whegs.

The team will show off their prototype at DARPA OFFSET Sprint-5 in February 2021; OFFSET is a program focused on building robots that can form adaptive human-swarm teams.

Thanks to the rise of 3D printers and hobbyist electronics, there are more open-source experimental robot designs than ever. We’ve seen smaller versions of Boston Dynamics’ famous Spot as well as simpler quadruped bots with more servos. a-WaLTR isn’t the first transforming robot we’ve seen, but we’re looking forward to seeing more unique takes on robotic locomotion in the future.

Thanks to [Qes] for sending this one in!

Flexible Actuators Spring Into Action

Most experiments in flexible robot actuators are based around pneumatics, but [Ayato Kanada] and [Tomoaki Mashimo] have been working on using a coiled spring as the moving component of a linear actuator. Building on their flexible ultrasonic motor (FUSM), [Yunosuke Sato] assembled a pair of FUSMs into a closed-loop actuator with motion control in two dimensions.

A single FUSM is pretty interesting by itself: its coiled spring is the only mechanical moving part. An earlier paper published by [Kanada] and [Mashimo] laid out how to push the spring through a hole in a metal block acting as the stator of this motor. Piezoelectric devices attached to that block minutely distort it in a controlled manner, resulting in linear motion of the spring.

For closed-loop feedback, the electrical resistance from the free end of the spring to the stator block can be measured and converted to linear distance to within a few millimeters. However, the acting end of the spring may be deformed by stretching or bending, which makes calculating its actual position difficult. Accounting for such deformation is a future topic for this group of researchers.
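
If the spring’s resistance grows roughly linearly with its exposed length, the conversion itself is a one-liner. A minimal sketch, with calibration constants that are stand-ins rather than the paper’s actual values:

```cpp
#include <cstdio>

// Convert measured spring resistance to extended length, assuming the
// resistance grows linearly with the length of spring between the stator
// and the free end. Both constants are illustrative stand-ins, not values
// from the paper.
constexpr double OHMS_PER_MM = 0.05;   // assumed spring resistance per mm
constexpr double CONTACT_OHMS = 0.30;  // assumed fixed contact resistance

double resistanceToLengthMm(double measuredOhms) {
    return (measuredOhms - CONTACT_OHMS) / OHMS_PER_MM;
}

int main() {
    double r = 2.8;  // example measurement, ohms
    std::printf("estimated extension: %.1f mm\n", resistanceToLengthMm(r));
    return 0;
}
```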

This work was presented at IROS 2020, which, like many other conferences this year, moved online and became IROS On-Demand. After a no-cost online registration, we can watch the 12-minute recorded presentation on this project, or any other at the conference. The video includes gems such as an exaggerated animation of stator block deformation to illustrate how a FUSM works, and an example of the position calculation challenge, where the intended circular motion actually resulted in an oval.

Speaking of conferences that have moved online, we have our own Hackaday Remoticon coming up soon!

Continue reading “Flexible Actuators Spring Into Action”

Nightmare Robot Only Moves When You Look Away

What could be more terrifying than ghosts, goblins, or clowns? How about a shapeless pile of fright on your bedroom floor that only moves when you’re not looking at it? That’s the idea behind [Sciencish]’s nightmare robot, which is lurking after the break. The Minecraft spider outfit is just a Halloween costume.

In this case, “looking at it” equates to you shining a flashlight on it, trying to figure out what’s under the pile of clothes. But here’s the thing — it never moves when light is shining on it. It quickly figures out the direction of the light source and lies in wait. After you give up and turn out the flashlight, it spins around to where the light was and starts moving in that direction.

The brains of this operation is an Arduino Uno, four light-dependent resistors, and a little bit of trigonometry to find the direction of the light source. The robot itself uses two steppers and printed herringbone gears for locomotion. Its chassis has holes in it that accept filament or wire to make a cage that serves two purposes — it makes the robot into more of an amorphous blob under the clothes, and it helps keep clothes from getting twisted up in the wheels. Check out the demo and build video after the break, because this thing is freaky fast and completely creepy.
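
Finding the flashlight’s bearing from four LDRs takes less trigonometry than you might expect. Here’s a hedged Arduino-style sketch of the idea; the pin assignments, threshold, and serial output are our own guesses, not [Sciencish]’s actual code:

```cpp
#include <math.h>

// Estimate the bearing of a light source from four LDRs at the front,
// right, back, and left of the robot. Pins, threshold, and serial output
// are illustrative guesses, not [Sciencish]'s actual code.
const int LDR_FRONT = A0, LDR_RIGHT = A1, LDR_BACK = A2, LDR_LEFT = A3;
const int DARK_THRESHOLD = 80;  // below this on every sensor = flashlight off

float lastAngle = 0;  // bearing of the most recent light source, degrees

void setup() {
  Serial.begin(9600);
}

void loop() {
  int f = analogRead(LDR_FRONT);
  int r = analogRead(LDR_RIGHT);
  int b = analogRead(LDR_BACK);
  int l = analogRead(LDR_LEFT);

  int brightest = f;
  if (r > brightest) brightest = r;
  if (b > brightest) brightest = b;
  if (l > brightest) brightest = l;

  if (brightest > DARK_THRESHOLD) {
    // Light is on: stay frozen, but remember where it's coming from.
    // Differential pairs give a light vector; atan2 turns it into a bearing.
    lastAngle = atan2(r - l, f - b) * 180.0 / M_PI;  // 0 deg = dead ahead
  } else {
    // Light is out: time to spin toward the remembered bearing and creep.
    Serial.print("creep toward ");
    Serial.println(lastAngle);
  }
  delay(100);
}
```

Using differential pairs cancels out ambient light, and atan2() covers all four quadrants without any special cases.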

While we usually see a candy-dispensing machine or two every Halloween, this year has been more about remote delivery systems. Don’t just leave sandwich bags full of fun-size candy bars all over your porch, build a candy cannon or a spooky slide instead.

Continue reading “Nightmare Robot Only Moves When You Look Away”

Five-Axis Pumpkin Carving

The day of carved pumpkins is near, and instead of doing it manually like a mere mortal, [Shane] of [Stuff Made Here] built a five-axis CNC machine to take over carving duties. (Video, embedded below.)

[Shane] initially intended to modify his barber robot, but ended up with a complete redesign, reusing only the electronics and the large ring bearing in the base. The swiveling spindle rides on a rotating gantry, with two sets of aluminum extrusions for vertical and horizontal motion. The gantry isn’t very rigid, but it’s good enough for pumpkin carving. Software was the most challenging part of the endeavor, due to the complexity of five-axis motion and the problem of mapping 2D images onto a roughly spherical surface. Cartographers have dealt with the latter for a long time, so [Shane] turned to the Mercator projection to solve it. We’re also relieved to hear that we aren’t the only ones who sometimes struggle with equation-heavy Wikipedia pages.
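
For reference, the forward Mercator projection fits in a couple of lines. A standalone sketch of the textbook formula (wrapping a carving image onto an actual pumpkin takes considerably more bookkeeping):

```cpp
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

// Forward Mercator projection: map longitude/latitude (radians) onto a
// plane. Inverting it lets a flat carving image be wrapped onto a roughly
// spherical pumpkin. This is the textbook formula, nothing more.
struct PlanePoint { double x, y; };

PlanePoint mercator(double lonRad, double latRad, double radius) {
    return {
        radius * lonRad,                                   // x: linear in longitude
        radius * std::log(std::tan(kPi / 4 + latRad / 2))  // y: stretches toward the poles
    };
}

int main() {
    // 45 degrees north, 30 degrees east, on a unit sphere.
    PlanePoint p = mercator(30 * kPi / 180, 45 * kPi / 180, 1.0);
    std::printf("x = %.3f, y = %.3f\n", p.x, p.y);
    return 0;
}
```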

Since there are no perfectly spherical pumpkins, [Shane] wrote a script, appropriately named “TSA.exe”, to probe the surface of the pumpkin with a microswitch before cutting. The machine is capable of carving both profiles and variable-depth lithophanes, mostly of [Shane]’s long-suffering wife. She seriously deserves an award for holding onto her sense of humor.
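
The probing pass boils down to sampling a radius at each angular step and offsetting the toolpath from the measurements. A hypothetical sketch of the idea, with a stub standing in for [Shane]’s actual machine control:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical surface-probing pass: rotate the pumpkin in steps and, at
// each step, advance a microswitch probe until it touches, recording the
// radius. moveUntilTouch() is a stand-in for real machine control.
double moveUntilTouch(double angleDeg) {
    // Stub: pretend the pumpkin is a slightly lumpy 120 mm sphere.
    return 120.0 + 5.0 * std::sin(3 * angleDeg * 3.14159265 / 180.0);
}

int main() {
    const int steps = 8;
    std::vector<double> radii;
    for (int i = 0; i < steps; ++i) {
        double angle = 360.0 * i / steps;
        radii.push_back(moveUntilTouch(angle));  // touch off, record, back away
    }
    // Carving depths later become offsets from these measured radii.
    for (int i = 0; i < steps; ++i)
        std::printf("%6.1f deg -> %.1f mm\n", 360.0 * i / steps, radii[i]);
    return 0;
}
```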

With projects like explosive baseball bats and a CNC basketball hoop, the [Stuff Made Here] YouTube channel is worth keeping an eye on.