Need A Snack From Across Town? Send Spot!

[Dave Niewinski] clearly knows a thing or two about robots, judging from his YouTube channel. Usually the projects involve robot arms mounted on some sort of wheeled platform, but this time it’s to the tune of some pretty famous yellow robot legs, in the shape of Spot from Boston Dynamics. The premise is simple — tell the robot what snacks you want, entirely by voice command, and off he goes to fetch them. But we’re not talking about navigating to the fridge in the same room. We’re talking about trotting out the front door, down the street, and across roads to visit a favorite restaurant. Spot will order the snacks and bring them back, fully autonomously.

Spot’s depth cameras provide localized navigation and object avoidance information
Local AI vision system handles avoiding those pesky moving objects

There are multiple things going on here, all of them pretty big computational tasks. Firstly, there is no cloud-based voice control à la Google Assistant or Alexa. The robot works on the premise of full autonomy, which means no internet connectivity for any part of the pipeline. All voice recognition, speech-to-text, and speech synthesis are performed locally using the NVIDIA Riva GPU-accelerated AI speech SDK, running on the NVIDIA Jetson AGX Orin carried on Spot’s back. A front-facing webcam supplies the audio feed. The voice recognition application listens for the wake phrase, then turns the snack order into text for later replay once the robot reaches its destination.

Navigation is taken care of by a MicroStrain RTK GNSS module, which has all the needed robustness, such as dual antennas and an inertial fallback for regions with spotty signal. Navigation alone is no use out in the real world, which is where Spot’s depth cameras come in: they enable local obstacle avoidance, as per the usual Spot behavior we’ve all seen before. But what about crossing the road without getting tens of thousands of dollars of someone else’s hardware crushed by a passing truck? Spot’s onboard streaming cameras are fed into NVIDIA’s DashCamNet AI model, which enables real-time recognition of moving obstacles such as cars, humans, and anything else that might be wandering around and get in the way.

All in all, a cool project showing the future potential of AI in robotics for important tasks, like fetching me a beer when I most need it, even if it comes from the local corner shop.
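For flavor, here’s a rough Python sketch of how such a snack-run loop could hang together. To be clear, this isn’t [Dave]’s code, and every helper function and constant below is a hypothetical stand-in: in the real build, Riva handles the listening and talking, DashCamNet watches for traffic, and the RTK GNSS module does the wayfinding.

```python
import time

WAKE_PHRASE = "hey spot"            # assumed wake phrase, not from the video
RESTAURANT = (40.4406, -79.9959)    # placeholder GNSS waypoint (lat, lon)

def transcribe_audio() -> str:
    # Placeholder for Riva streaming ASR on the webcam's audio feed.
    return "hey spot, grab me a burrito"

def speak(text: str) -> None:
    # Placeholder for Riva text-to-speech at the restaurant counter.
    print(f"[TTS] {text}")

def moving_obstacle_detected() -> bool:
    # Placeholder for DashCamNet inference on the streaming cameras.
    return False

def navigate_to(waypoint) -> None:
    # Placeholder for RTK-GNSS waypoint following with depth-camera avoidance;
    # pause whenever the vision network flags cars or pedestrians.
    while moving_obstacle_detected():
        time.sleep(0.5)
    print(f"Arrived at {waypoint}")

def snack_run(home=(0.0, 0.0)):
    # 1. Wait for the wake phrase, then keep the order text for later replay.
    heard = transcribe_audio()
    while WAKE_PHRASE not in heard.lower():
        heard = transcribe_audio()
    order = heard

    # 2. Trot to the restaurant, 3. replay the order, 4. bring the goods home.
    navigate_to(RESTAURANT)
    speak(order)
    navigate_to(home)

if __name__ == "__main__":
    snack_run()
```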

We love robots around here. Robots can mow your lawn, navigate inside your house with a little help from invisible QR codes, even help out with growing your food. The long-promised robot-assisted future may now be looking more like the present.



This 3D Printed Robot Can Actually Pick Locks

Lockpicking is more of an art than a science: it’s probably 10% knowledge and 90% feeling. Only practice will teach you how much torque to apply to the cylinder, how to sense when you’ve pushed a pin far enough, or what it feels like when a pin springs back. Surely a robot would never be able to replicate such a delicate process, would it?

Well, not according to [Lance] over at [Sparks and Code], who thought that building a lock picking robot would be an interesting challenge. He started out with a frame to hold a padlock and a servo motor to apply torque. A load cell measures the amount of force applied. This helps to keep the lock under a constant amount of tension as each pin is picked in succession. Although slow, this method seemed to work when moving the pick manually.

The difficult part was automating the pick movement. [Lance] built a clever system, driven by two motors, that would keep the pick perfectly straight while moving it horizontally and vertically. This was hard enough to get working correctly, but after adding a few additional clamps to remove wobble in the leadscrew, the robot was able to start picking. A second load cell inside the pick arm would detect the amount of force on each pin as the robot worked its way across the lock, pin by pin.
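As a thought experiment, the intended pin-by-pin routine might look something like the Python sketch below. The helper functions and force thresholds are all hypothetical placeholders rather than anything from [Lance]’s build; the point is just the two feedback loops: one nudging the tension servo to hold the cylinder at light torque, and one lifting each pin until the load cell in the pick arm says it has been pushed far enough.

```python
# Conceptual sketch of the pin-by-pin picking routine; the hardware helpers
# and thresholds below are hypothetical stand-ins, not [Lance]'s code.

TENSION_TARGET = 0.8     # assumed load-cell reading for "light" cylinder torque
TENSION_TOLERANCE = 0.05
PIN_SET_FORCE = 1.5      # assumed pick-arm force that means a pin is pushed far enough

def read_cylinder_tension() -> float:
    # Placeholder for the load cell measuring torque on the lock cylinder.
    return TENSION_TARGET

def read_pick_force() -> float:
    # Placeholder for the second load cell inside the pick arm.
    return 0.0

def nudge_tension_servo(delta: float) -> None:
    # Placeholder for the servo motor applying torque to the cylinder.
    pass

def move_pick(x_mm: float, lift_mm: float) -> None:
    # Placeholder for the two-motor gantry keeping the pick straight.
    pass

def hold_tension() -> None:
    # Simple proportional correction to keep the cylinder under constant, light tension.
    error = TENSION_TARGET - read_cylinder_tension()
    if abs(error) > TENSION_TOLERANCE:
        nudge_tension_servo(0.5 * error)

def pick_lock(pin_positions_mm, lift_step_mm=0.1, max_lift_mm=3.0) -> None:
    for x in pin_positions_mm:                # work across the lock, pin by pin
        lift = 0.0
        while lift < max_lift_mm:
            hold_tension()
            move_pick(x, lift)
            if read_pick_force() > PIN_SET_FORCE:   # pin pushed far enough, move on
                break
            lift += lift_step_mm

pick_lock([5.0, 10.0, 15.0, 20.0, 25.0])      # made-up pin spacing in millimetres
```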

At least, that was the idea: as it turned out, simply dragging the pick across all pins in one go was enough to open the lock. A much simpler design could have achieved that, but no matter: designing a robot for all these intricate motions was a great learning experience anyway. It also gave [Lance] a good platform to start working on a more advanced robot that can pick higher-quality locks in which the dragging technique doesn’t work.

We haven’t come across lockpicking robots before; perhaps the closest equivalent would be this 3D-printed Snap Gun. If you’re interested in all aspects of locks and how to apply them, check out our Physical Security Hack Chat with Deviant Ollam.


2022 Sci-Fi Contest: A Mac-Based Droid Named R.O.B.

Droids and robot assistants are still not really a part of our daily lives, even though they started showing up in movies many decades ago. [Rudy Aramaryo] perhaps hopes that will change one day, and is pursuing this goal with their own droid build named R.O.B.

R.O.B. is quite a hefty ‘bot, weighing 140 lbs and sporting a full 80 Ah of lithium-iron-phosphate batteries for a long running time and plenty of power. For brains, R.O.B. packs in an Apple M1 Mac Mini and a Mac Studio, running macOS. It’s an unusual choice for a robot, but one that brings plenty of computing power to bear nonetheless. Equipped with tracked propulsion, R.O.B. also features a slip-ring setup in the base, allowing the droid to rotate endlessly without tangling wires.

By virtue of its size and power, R.O.B. goes a long way toward emulating the general feel of the droids of the Star Wars series. It’s all about the roughly human-scaled design and the anthropomorphic features. Further helping the cause is a basic chat ability powered by Python, along with arms and actuators to interact with the world.
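The write-up doesn’t go into detail on the chat side, but a “basic chat ability powered by Python” can be as simple as keyword matching on whatever the speech layer hears. The snippet below is purely our own illustration of that idea, not [Rudy Aramaryo]’s code.

```python
# Minimal keyword-matching chat loop, purely illustrative.

RESPONSES = {
    "hello": "Hello! R.O.B. at your service.",
    "name": "My name is R.O.B.",
    "battery": "I am running on 80 Ah of LiFePO4 cells, thanks for asking.",
}

def reply(utterance: str) -> str:
    text = utterance.lower()
    for keyword, answer in RESPONSES.items():
        if keyword in text:
            return answer
    return "I did not quite catch that."

if __name__ == "__main__":
    while True:
        line = input("you> ")
        if line.strip() in ("quit", "exit"):
            break
        print("R.O.B.>", reply(line))
```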

The name of this droid reminds us of the charming Nintendo console toy from the 1980s. If these aren’t the droids you’re looking for, and you’ve been hacking on ‘bots of your own, be sure to drop us a line.

Omnibot Shows Off Over A Decade Of CNC Prowess

At first glance, you might think the Omnibot v3 isn’t anything more than a basic 3D printed robotics platform, but you’d be wrong on both counts. There are actually no 3D printed parts in the build, and while you could describe the platform as simplistic, calling it basic certainly doesn’t do the clever design justice. In the video after the break, creator [Michal] takes us through the process of designing and building this high-quality bot.

The build starts with huge amounts of time and effort in a CAD program, designing the Omnibot v3 with its four-wheel steering and its ability to do fancy things like spin in place. With the CAD and 3D renders out of the way, the process of transforming the digital into the physical began with a CNC router.
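If you’re curious what four-wheel steering buys you mathematically, here’s a small Python illustration (our own, not from the project): steer every wheel tangent to the circle around the instantaneous centre of rotation, and scale its speed with the distance to that centre. Put the centre at the robot’s origin and you get the spin-in-place party trick. The wheel layout is an assumed one, purely for the numbers.

```python
import math

# Assumed wheel layout (metres, x forward / y left), purely illustrative.
WHEELS = {
    "front_left":  ( 0.15,  0.12),
    "front_right": ( 0.15, -0.12),
    "rear_left":   (-0.15,  0.12),
    "rear_right":  (-0.15, -0.12),
}

def wheel_commands(icr_x, icr_y, omega):
    """Steering angle (rad) and ground speed (m/s) per wheel for rotation
    at `omega` rad/s about the ICR. ICR at (0, 0) means spin in place."""
    commands = {}
    for name, (x, y) in WHEELS.items():
        rx, ry = x - icr_x, y - icr_y
        vx, vy = -omega * ry, omega * rx        # v = omega cross r
        angle = math.atan2(vy, vx)
        speed = math.hypot(vx, vy)
        # Keep steering within +/-90 degrees by reversing the drive direction.
        if angle > math.pi / 2:
            angle -= math.pi
            speed = -speed
        elif angle < -math.pi / 2:
            angle += math.pi
            speed = -speed
        commands[name] = (angle, speed)
    return commands

if __name__ == "__main__":
    # Spin in place at 1 rad/s: all four wheels end up at roughly +/-51 degrees.
    for wheel, (angle, speed) in wheel_commands(0.0, 0.0, 1.0).items():
        print(f"{wheel:12s} steer {math.degrees(angle):6.1f} deg, speed {speed:+.2f} m/s")
```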

Rather than routing the individual components out of a suitable material, [Michal] cut forms. Those forms were made only for the creation of silicone molds. Those silicone molds were then used to cast the actual parts in polyurethane resin. It is these resin parts that make up the actual Omnibot v3, which is manually demonstrated at the end of the video.

All in all, it’s a neat project with a neat process. If we were to stop here, things would be mostly complete and you’d click on to the next great Hackaday article. But there’s more to be had here. You see, [Michal] is also the fellow behind the Guerrilla guide to CNC and resin casting. In his own words: “CNC machining and resin casting are an underappreciated method for producing engineering-grade parts, but the process is fast, predictable, and garage-friendly.” After seeing the results, we can’t help but agree.

By the way, before anybody in the comments can yell “DUPE!”, we already know. You see, we featured the Guerrilla guide to CNC and resin casting once before, almost exactly 11.5 years ago. It’s been updated since then, and appears to be an absolute gold mine of information for anybody wanting to walk in [Michal]’s shoes.


A Line Follower With No Brains

A line follower is a common project for anyone wishing to make a start in robotics: a small wheeled device, usually with some kind of optical sensor, which allows it to follow a line drawn on the surface over which it runs. In most cases they incorporate a small microcontroller, or perhaps an analogue computer, which supplies power and steering control, but as the Crayon Car from [Greg Zumwalt] demonstrates, it’s possible to make a line follower without any brains at all.

This seemingly impossible feat is achieved thanks to the line and road surface: the robot runs on a piece of paper over which the line is drawn with a crayon. It has a single straight-line drive wheel at one end and a pair of driven rollers at 90 degrees to each other at the other end, with the magic happening due to the difference in friction between paper and crayon. The robot follows a circular track with no problem, and while we can see it’s not without flaws, we doubt it would be possible to make a simpler follower.

Sharp-eyed readers will have noticed that this is not the first line follower we’ve shown you which claims to have no brains, but we’d claim that since the previous machine had an analogue circuit, this one is a more worthy contender to the crown.


PicoCat, printed in yellow filament, looking at you with its ultrasonic sensor eyes

Build Your Own Cat – Some Assembly Required

Robotic pets are sci-fi material, and [Kevin McAleer] from [Kev’s Robots] is moving us all ever closer to a brighter, happier, more robotic future. One of his latest robot builds, PicoCat, is a robot cat with servo-driven paws. It follows in the footsteps of the OpenCat project made by Dr. Rongzhong Li back in 2016, and we’re always happy to see someone pick up where another hacker left off. [Kevin] took heavy inspiration from the OpenCat design, rebuilding it with hardware that’s more friendly and accessible for makers today.

Projects like these, involving data processing and calculations to get the servos moving just right, stand to benefit from the computing power of the recently released RP2040 MCU. As such, the Pimoroni Servo 2040 board is a crucial component of this build, being both the brains of the project and a PIO-boosted driver for the eleven servos helping this robot come alive. This cat’s eyes are an ultrasonic sensor, and you can add a whole lot more sensors for any robotic intention of yours. Don’t expect this kitty to jump one meter high or scratch your favourite couch to death just yet, but there’s already a lot of potential, especially coupled with a small speaker.
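If you want a feel for what’s happening under the hood, the snippet below is a bare-bones MicroPython sketch of wiggling a single hobby servo from an RP2040 using plain PWM; the pin number and pulse widths are assumptions on our part, and the real PicoCat leans on Pimoroni’s Servo 2040 library (and the PIO) to run all eleven joints at once.

```python
from machine import Pin, PWM
import time

SERVO_PIN = 0          # assumed GPIO for one leg servo
PULSE_MIN_US = 500     # pulse width at -90 degrees (check your servo's spec)
PULSE_MAX_US = 2500    # pulse width at +90 degrees

pwm = PWM(Pin(SERVO_PIN))
pwm.freq(50)           # standard 20 ms servo frame

def set_angle(deg):
    """Map -90..+90 degrees onto the pulse width range and update the PWM."""
    deg = max(-90, min(90, deg))
    pulse_us = PULSE_MIN_US + (deg + 90) / 180 * (PULSE_MAX_US - PULSE_MIN_US)
    pwm.duty_u16(int(pulse_us * 65535 / 20000))   # 20000 us per 50 Hz frame

# Slowly sweep the joint back and forth, PicoCat stretch style.
while True:
    for angle in range(-45, 46, 5):
        set_angle(angle)
        time.sleep_ms(50)
    for angle in range(45, -46, -5):
        set_angle(angle)
        time.sleep_ms(50)
```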

A PicoCat with a non-robotic kitten in the background

Does this robotic cat interest you, whether it be due to your sci-fi propensity or a cat hair allergy? You’re in luck, because [Kevin] is keeping things firmly in the “open-source everything” realm. MicroPython code is stored in a GitHub repo, STLs are in a .zip linked on the page, and there are plenty of renders to never leave you confused about what goes where. With all these resources, you can source the servos and the boards, fire up your 3D printer, and sit down to assemble your own PicoCat. But not just that: [Kevin] also recorded three whole streams with insights, giving us over four hours of how-it-came-to-be video material to learn from. First, two streams of him designing the PicoCat in Fusion 360, and then one of him talking about the way he creates unit tests in MicroPython to improve his robots’ reliability and significantly reduce the number of bugs cropping up.
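On the testing front, the general idea is to keep the pure maths of the robot code separate from the hardware so it can be exercised anywhere, on the board or on a desktop interpreter. A tiny, hypothetical example in that spirit (not [Kevin]’s actual tests) might look like this:

```python
# Exercise a hypothetical angle-clamping helper with plain asserts,
# so the test runs without any servos or sensors attached.

def clamp_angle(deg, lo=-90, hi=90):
    """Keep a requested joint angle inside the servo's safe range."""
    return max(lo, min(hi, deg))

def test_clamp_angle():
    assert clamp_angle(0) == 0
    assert clamp_angle(120) == 90      # over-travel gets limited
    assert clamp_angle(-300) == -90
    assert clamp_angle(45, lo=0, hi=60) == 45
    print("clamp_angle: all tests passed")

test_clamp_angle()
```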

This is not the last we will hear from [Kevin]’s robot-filled workshop, and previously, we’ve covered his Cray-1-shaped Pi Zero cluster system and a Raspberry Pi theremin, both as open and reproducible as this kitty! As you assemble yourself a PicoCat, or perhaps a Stanford Pupper or any of the other lovely quadru-pets we’ve previously featured, you might wonder how to properly move the servos, and we’ve covered a project that teaches you specifically that.


2022 Sci-Fi Contest: Motorized AT-AT Walker Gets Around With Servos

The AT-AT Walker was one of the more fearsome weapons of the Star Wars universe, even if it was incredibly slow and vulnerable to getting tangled up in Rebel tow cables. However, you can build your own small-scale example using servos for propulsion, as [Luke J. Barker] ably demonstrates.

Taking off the outer shell reveals the servo motors driving the leg linkages.

The build is a remix of the motorized AT-AT from [LtDan] on Thingiverse, originally powered by a 90 rpm DC gearmotor. [Luke] remixed the design, setting it up to be driven by eight servomotors instead. They’re controlled from a SparkFun RedBoard Edge, an Arduino-compatible microcontroller board that fits rather neatly inside the AT-AT shell.

Programmed with a simple sine-wave walk cycle, the AT-AT ambles along in a ponderous manner. It’s altogether very much like the real fictitious thing, albeit without the scorching sizzle of blaster fire ringing out across a frozen plain.
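For the curious, the core of a sine-wave walk cycle is just one sine per leg with a phase offset between the legs. The Python sketch below is our own simplified illustration of that idea, with one hip servo per leg and made-up amplitudes, rather than [Luke]’s Arduino firmware.

```python
import math

NEUTRAL_DEG = 90        # assumed servo centre position
AMPLITUDE_DEG = 20      # assumed swing either side of neutral
PERIOD_S = 2.0          # one full step cycle every two seconds

# Phase offset per leg: diagonal pairs move in step, like a slow trot.
LEG_PHASE = {
    "front_left": 0.0,
    "rear_right": 0.0,
    "front_right": math.pi,
    "rear_left": math.pi,
}

def leg_angle(leg: str, t: float) -> float:
    """Servo angle in degrees for `leg` at time `t` seconds."""
    phase = 2 * math.pi * t / PERIOD_S + LEG_PHASE[leg]
    return NEUTRAL_DEG + AMPLITUDE_DEG * math.sin(phase)

if __name__ == "__main__":
    # Print one gait cycle so you can see the diagonal legs alternating.
    for step in range(0, 11):
        t = step * PERIOD_S / 10
        angles = {leg: round(leg_angle(leg, t)) for leg in LEG_PHASE}
        print(f"t={t:4.1f}s {angles}")
```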

Quadruped vehicles never really caught on for military use, but that’s not to say nobody ever tried. Video after the break.
