A lock picking robot

This 3D Printed Robot Can Actually Pick Locks

Lockpicking is more of an art than a science: it’s probably 10% knowledge and 90% feeling. Only practice will teach you how much torque to apply to the cylinder, how to sense when you’ve pushed a pin far enough, or what it feels like when a pin springs back. Surely a robot would never be able to replicate such a delicate process, would it?

Well, not according to [Lance] over at [Sparks and Code], who thought that building a lock picking robot would be an interesting challenge. He started out with a frame to hold a padlock and a servo motor to apply torque. A load cell measures the amount of force applied. This helps to keep the lock under a constant amount of tension as each pin is picked in succession. Although slow, this method seemed to work when moving the pick manually.
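The write-up doesn’t include firmware, so purely as an illustration of the idea, here is a minimal Arduino-style sketch of the kind of constant-tension loop described above: read the load cell and nudge the tensioning servo a degree at a time until the measured force sits inside a target band. The HX711 load-cell amplifier, the pin assignments, and the force thresholds are all assumptions, not details from [Lance]’s build.

```cpp
// Illustrative only: constant-tension loop for the lock cylinder.
// Pins, the HX711 amplifier, and all thresholds are assumptions.
#include <Servo.h>
#include <HX711.h>   // common load-cell amplifier library

const int LOADCELL_DOUT = 2;
const int LOADCELL_SCK  = 3;
const int SERVO_PIN     = 9;

const float TARGET_FORCE = 150.0;  // arbitrary load-cell units
const float DEADBAND     = 10.0;   // don't chase sensor noise

HX711 scale;
Servo tensionServo;
int servoAngle = 90;               // start at mid-travel

void setup() {
  scale.begin(LOADCELL_DOUT, LOADCELL_SCK);
  tensionServo.attach(SERVO_PIN);
  tensionServo.write(servoAngle);
}

void loop() {
  float force = scale.get_units(5);  // average of five readings

  // Crude closed-loop control: step the servo one degree at a time
  // toward the target tension, backing off if we're pulling too hard.
  if (force > TARGET_FORCE + DEADBAND && servoAngle > 0) {
    servoAngle--;
  } else if (force < TARGET_FORCE - DEADBAND && servoAngle < 180) {
    servoAngle++;
  }
  tensionServo.write(servoAngle);
  delay(20);
}
```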

The difficult part was automating the pick movement. [Lance] built a clever system driven by two motors that keeps the pick perfectly straight while moving it horizontally and vertically. This was hard enough to get working correctly, but after adding a few additional clamps to remove wobble in the leadscrew, the robot was able to start picking. A second load cell inside the pick arm detects the amount of force on each pin, letting the robot work its way across the lock, pin by pin.

At least, that was the idea: as it turned out, simply dragging the pick across all pins in one go was enough to open the lock. A much simpler design could have achieved that, but no matter: designing a robot for all these intricate motions was a great learning experience anyway. It also gave [Lance] a good platform to start working on a more advanced robot that can pick higher-quality locks in which the dragging technique doesn’t work.

We haven’t come across lockpicking robots before; perhaps the closest equivalent would be this 3D-printed Snap Gun. If you’re interested in all aspects of locks and how to apply them, check out our Physical Security Hack Chat with Deviant Ollam.

Continue reading “This 3D Printed Robot Can Actually Pick Locks”

2022 Sci-Fi Contest: A Friendly Wall Drawing Robot

Drawing on walls is fine for children, but adults tend to get bored quickly with such antics. Even more so when they realize who is responsible for cleaning up afterwards. Instead, consider delegating those duties to a friendly helper by the name of Fumik, as [engineer2you] has done.

Fumik, who looks like a cute little jellyfish, can draw pictures up to 5 meters wide and 3 meters high, making for a massive canvas. The robot is powered by an Arduino Mega 2560 outfitted with a CNC shield; a pair of stepper motors drive pulleys with toothed belts to move Fumik to various positions along the wall, while another, smaller stepper motor drives the pen forwards and backwards as needed. Fumik can be programmed to trace out various designs in SVG format. These must be converted to code and programmed into the Arduino, at which point Fumik can begin work, drawing on the wall with its pen.
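The write-up doesn’t go into the motion math, but two belts running from fixed anchor points is the classic hanging-plotter arrangement, in which every XY target is converted into a pair of belt lengths before being turned into stepper steps. Here is a minimal sketch of that conversion; it assumes (and this is only an assumption) that Fumik works the same way, with the motors anchored at the top corners of the roughly 5-meter-wide drawing area.

```cpp
#include <cmath>
#include <cstdio>

// Assumed geometry: belt anchors at the top-left (0, 0) and top-right
// (WIDTH, 0) corners, X growing to the right, Y growing downward.
const double WIDTH = 5000.0;   // mm, the ~5 m drawing width mentioned above

struct BeltLengths {
  double left;
  double right;
};

// Convert a target pen position (mm) into the two belt lengths (mm).
BeltLengths beltsFor(double x, double y) {
  return {
    std::hypot(x, y),           // distance from the left anchor
    std::hypot(WIDTH - x, y)    // distance from the right anchor
  };
}

int main() {
  // Example: a point 2 m across and 1.5 m down the wall.
  BeltLengths b = beltsFor(2000.0, 1500.0);
  std::printf("left %.1f mm, right %.1f mm\n", b.left, b.right);
  return 0;
}
```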

It’s a fun build, and based on photos shared by [engineer2you], Fumik is quite adept at drawing clean and neat designs without a lot of smudging or jagged lines. As a bonus, it’s easy to swap out the pen, so multicolored designs can be drawn in multiple passes.

We’ve seen other robot drawing builds before, too, like this capable portrait artist. Video after the break.

Continue reading “2022 Sci-Fi Contest: A Friendly Wall Drawing Robot”

2022 Sci-Fi Contest: A Hand-Following Robot, Powered By Arduino

If there’s one thing audiences love in sci-fi, it’s a cute robot companion that follows the heroes around. If you want one of your own, starting with this build from [mircemk] could be just the ticket.

The build relies on the classic Arduino Uno microcontroller, which talks to an HC-SR04 ultrasonic sensor module and two infrared sensors in order to track a human target and follow it around. Drive comes from four DC gear motors, controlled by an L293D motor driver, with a two-cell lithium battery providing power for everything onboard.

The robot works in a simple manner, following a hand placed in front of its sensors. First, it checks for the presence of an object ahead using the ultrasonic sensor. If something is detected, the twin infrared sensors mounted left and right guide the robot as it follows the hand.
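As a rough sketch of how that logic might look in an Arduino program (the pin assignments, distance thresholds, and active-low IR outputs are assumptions, not details from [mircemk]’s code):

```cpp
// Illustrative follow logic only; pins and thresholds are assumptions.
const int TRIG_PIN = 7, ECHO_PIN = 8;                    // HC-SR04
const int IR_LEFT = A0, IR_RIGHT = A1;                   // digital-output IR modules
const int L_FWD = 3, L_REV = 4, R_FWD = 5, R_REV = 6;    // L293D inputs

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000UL);   // ~5 m timeout
  return us / 58;                               // microseconds to centimeters
}

void setMotors(int leftDir, int rightDir) {     // 1 = forward, -1 = reverse, 0 = stop
  digitalWrite(L_FWD, leftDir > 0);
  digitalWrite(L_REV, leftDir < 0);
  digitalWrite(R_FWD, rightDir > 0);
  digitalWrite(R_REV, rightDir < 0);
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT); pinMode(ECHO_PIN, INPUT);
  pinMode(IR_LEFT, INPUT);   pinMode(IR_RIGHT, INPUT);
  pinMode(L_FWD, OUTPUT); pinMode(L_REV, OUTPUT);
  pinMode(R_FWD, OUTPUT); pinMode(R_REV, OUTPUT);
}

void loop() {
  long d   = readDistanceCm();
  bool irL = digitalRead(IR_LEFT)  == LOW;  // LOW = hand seen (assumed active-low)
  bool irR = digitalRead(IR_RIGHT) == LOW;

  if (d > 10 && d < 40)   setMotors( 1,  1);  // hand ahead: drive straight
  else if (irL && !irR)   setMotors(-1,  1);  // hand to the left: pivot left
  else if (irR && !irL)   setMotors( 1, -1);  // hand to the right: pivot right
  else                    setMotors( 0,  0);  // nothing sensed: stop
  delay(50);
}
```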

It’s not a sophisticated algorithm, and it won’t really let your robot follow you down a crowded street. However, it’s a great project to learn on for beginners and could serve as a great entry into more advanced projects using face tracking or other techniques. Video after the break. Continue reading “2022 Sci-Fi Contest: A Hand-Following Robot, Powered By Arduino”

Robotic Boat Rides High On PVC Pipe Pontoons

If you want to build your own rover, there are plenty of cheap RC trucks out there that will provide a serviceable chassis to work with. Looking to go airborne with a custom drone? Thanks to the immense popularity of first-person view (FPV) flying, you’ll find a nearly infinite variety of affordable fixed-wing and quadcopter platforms to choose from. But when it comes to robotic watercraft, the turn-key options aren’t nearly as plentiful; the toys are all too small, and the commercial options are priced for entities with an R&D budget to burn. For amateur aquatic explorers, creativity is the name of the game.

Take for example this impressive vessel built by [wesgood]. With a 3D printed electronics enclosure mounted to a pair of pontoons made of cheap 4-inch PVC pipe available from the hardware store, it provides a stable platform without breaking the bank. Commercial jet drive units built into the printed tail caps for the pipes provide propulsion, and allow the craft to be steered through differential thrust. Without rudders or exposed propellers, this design is particularly well-suited for operating in shallow waters.
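Steering by differential thrust boils down to a simple mixer: a steering command is added to one thruster’s throttle and subtracted from the other’s. Here is a minimal, generic sketch of that mixing; [wesgood]’s actual control code isn’t reproduced here, so treat the ranges and names as assumptions.

```cpp
#include <algorithm>
#include <cstdio>

// Throttle and steering are in [-1, 1]; positive steering turns to starboard.
// Outputs are per-thruster commands in the same range, to be mapped onto
// whatever PWM or ESC scale the jet drives expect.
struct ThrustPair {
  double left;
  double right;
};

ThrustPair mixDifferential(double throttle, double steering) {
  double left  = throttle + steering;
  double right = throttle - steering;
  // Clamp so a hard turn at full throttle doesn't exceed the output range.
  return { std::clamp(left, -1.0, 1.0), std::clamp(right, -1.0, 1.0) };
}

int main() {
  ThrustPair t = mixDifferential(0.8, 0.3);   // mostly forward, easing right
  std::printf("left %.2f  right %.2f\n", t.left, t.right);
  return 0;
}
```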

A removable electronics tray allows for easy access.

Perched high above the water, the electronics box contains a Raspberry Pi 2, a BU353 USB GPS receiver, and an Arduino Mega 2560 paired with a custom PCB that offers up convenient ports for a dual-channel Cytron 3 amp motor driver and an Adafruit BNO055 9-DOF IMU. Power comes from two 6,000 mAh LiPo batteries mounted low in the pontoons, and a matching pair of Adafruit current/voltage sensors keeps track of the energy budget. A small USB WiFi dongle with an external antenna plugged into the Pi offers up a WiFi network that [wesgood] can connect to with an iPad for control.

If the control software for the craft looks particularly well-polished, it’s probably because [wesgood] just so happens to be a professional developer with a focus on mobile applications. While we’re a bit skeptical of using WiFi for a critical long-distance link, we can’t deny that the iPad allows for a very slick interface. In addition to showing the status of the craft’s various systems, it lets the user either take manual control or place waypoints for autonomous navigation — although it sounds like that last feature is only partially implemented right now.
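Waypoint navigation of this sort typically combines the GPS fix with the IMU heading: compute the bearing from the boat to the waypoint, compare it against the current heading, and steer to close the error. The following is a generic illustration of that math, not [wesgood]’s implementation.

```cpp
#include <cmath>
#include <cstdio>

const double kDegToRad = 3.14159265358979323846 / 180.0;

// Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
// in degrees clockwise from true north.
double bearingTo(double lat1, double lon1, double lat2, double lon2) {
  double dLon = (lon2 - lon1) * kDegToRad;
  double y = std::sin(dLon) * std::cos(lat2 * kDegToRad);
  double x = std::cos(lat1 * kDegToRad) * std::sin(lat2 * kDegToRad) -
             std::sin(lat1 * kDegToRad) * std::cos(lat2 * kDegToRad) * std::cos(dLon);
  return std::fmod(std::atan2(y, x) / kDegToRad + 360.0, 360.0);
}

// Signed heading error in [-180, 180): positive means "steer right".
double headingError(double bearing, double heading) {
  return std::fmod(bearing - heading + 540.0, 360.0) - 180.0;
}

int main() {
  double want = bearingTo(44.9778, -93.2650, 44.9800, -93.2600);  // example fix and waypoint
  double err  = headingError(want, 90.0);   // IMU says we're pointing due east
  std::printf("bearing %.1f deg, error %+.1f deg\n", want, err);
  return 0;
}
```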

We love this design, and are eager to see more as the project develops. Recently [wesgood] experimented with payloads that can be suspended from the bottom of the electronics box, specifically a sonar module for performing bathymetric observations. There’s considerable interest in crowdsourced depth maps for inland waterways, and a robotic craft that can reliably chart these areas autonomously is certainly a step up from having to collect the data manually.

Amazing “Connect Fore!” Robot Challenges Your Putting Practice

We’ve just come across [Bithead]’s amazing, robotically-automated mashup of miniature golf and Connect Four, which also includes an AI opponent who pulls no punches in its drive to win. Connect Fore! celebrates Scotland — the birthplace of golf, after all — and looks absolutely fantastic.

Scotty, the AI opponent, uses this robotic turret to make its moves in a game of Connect Fore!

The way it works is this: players take turns putting colored balls into one of seven different holes at the far end of the table. Each hole feeds into a clear tube, visible in the middle of the table, that represents one of the columns in a game of Connect Four.

Each player attempts to stack balls in such a way that they create an unbroken line of four in their color, either horizontally, vertically, or diagonally. In a one-player game, a human player faces off against “Scotty”, the computer program that chooses its moves with intelligence and fires balls from a robotic turret.
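Underneath the putting, the game state is ordinary Connect Four: seven columns of stacked balls, and a win is any unbroken run of four. A compact win check might look like the sketch below; this is a generic illustration, not [Bithead]’s code.

```cpp
#include <array>
#include <cstdio>

// board[col][row], row 0 at the bottom; 0 = empty, 1 = human, 2 = Scotty.
using Board = std::array<std::array<int, 6>, 7>;

bool hasWon(const Board& b, int who) {
  // Directions to scan: right, up, up-right, down-right.
  const int dirs[4][2] = {{1, 0}, {0, 1}, {1, 1}, {1, -1}};
  for (int c = 0; c < 7; ++c) {
    for (int r = 0; r < 6; ++r) {
      for (const auto& d : dirs) {
        int run = 0;
        for (int k = 0; k < 4; ++k) {
          int cc = c + d[0] * k, rr = r + d[1] * k;
          if (cc < 0 || cc >= 7 || rr < 0 || rr >= 6 || b[cc][rr] != who) break;
          ++run;
        }
        if (run == 4) return true;
      }
    }
  }
  return false;
}

int main() {
  Board b{};                                   // start empty
  for (int r = 0; r < 4; ++r) b[3][r] = 1;     // four balls stacked in column 3
  std::printf("player 1 wins: %s\n", hasWon(b, 1) ? "yes" : "no");
  return 0;
}
```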

[Bithead] started this project as a learning experience, and since it’s such a complex build, the write-up is extensive. We really recommend reading through the whole thing if you are at all interested in what goes into making such a project work.

What’s particularly interesting is all of the ways in which things nearly worked, or needed nudging or fine adjustment. One might think that reliably getting a ball to enter a hole and roll down a PVC tube wouldn’t be a particularly finicky task, but it turns out that all kinds of things can go wrong.

Even finding the right play surface was a challenge. [Bithead]’s first purchase from Amazon was a total waste: it looked bad, smelled bad, and balls didn’t roll well on it. There are high-quality artificial turfs out there, but the good stuff gets shockingly expensive, and such a small project pretty much pigeonholes one as a nuisance customer when it comes to vendors. The challenges [Bithead] overcame serve as a reminder to keep the 80/20 rule (or Pareto principle) in mind when estimating what will get a project to the finish line.

Right under the page break below is a brief video tour of the completed table, and after that, you can watch a game in action as [Bithead] faces off against Scotty the AI. Curious about the inner workings? The last video has some build details that fill in a few blanks from the write-up.

We’ve seen an automated chess table before, but this is on an entirely different, utterly fantastic level of work.
Continue reading “Amazing “Connect Fore!” Robot Challenges Your Putting Practice”

ElectronBot: A Sweet Mini Desktop Robot That Ticks All The Boxes

[Peng Zhihui] seems to have found some spare time and energy to crack out another sweet robot build, this time a much smaller and cuter emoji-bot (original GitHub link), with the usual production-ready levels of attention to detail. With a lot of fine detail in the 3D-printed models, this is one for SLS printing in nylon, but that can be done for a reasonable outlay, in China at least. The electronics package consists of a few tiny, fully custom PCBs designed with Altium Designer, along with off-the-shelf modules for the circular LCD and camera. The main board hosts an STM32F405 and handles the display and SD card; this particular STM32 was chosen because it can connect to an external USB3300 high-speed USB PHY. A separate sensor PCB handles the gesture sensor, a USB hub, an MPU6050 6-axis IMU, and the USB camera module. This board attaches to the USB-C connector in the base via an FFC cable, allowing the robot to rotate on its base.

Cunning two-servo shoulder mechanism

[Peng] clearly has exacting standards as to how things should work, and we guess he wanted the arms to be back-drivable in a way that lets the host computer track and record the motor positions for replaying later on. The connection back to the controller is via I2C, allowing all five servos to hang on the same bus and saving precious resources. Smart! Getting a processor and motor driver into such a tiny space was a bit of a challenge, but a walk in the park for [Peng], as he demonstrates in the video embedded below (we believe English subtitles are pending!). The arm mechanism is particularly interesting and rather elegantly executed, and he does seem proud of this part of the design, as well he should be! As with [Peng]’s other projects, there is a lot to see, and plenty of scope for feature explosion. It was nice to see the ’bot being used as an input device, not only with gesture sensing via the dedicated sensor, but also using the camera with OpenCV to track user posture and act accordingly. This thing could serve as a genuinely useful AI device, while being darn cute at the same time!
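To give a feel for what “all five servos on one I2C bus” looks like from the host side, here is a generic Arduino-style sketch that polls each servo for its current position so a back-driven pose can be recorded and replayed later. The bus addresses and the single position register are made up for illustration; they are not ElectronBot’s actual protocol.

```cpp
#include <Wire.h>

// Hypothetical addresses and register; NOT ElectronBot's real protocol.
const uint8_t SERVO_ADDR[5] = {0x10, 0x11, 0x12, 0x13, 0x14};
const uint8_t REG_POSITION  = 0x01;

int16_t positions[5];

// Read one servo's current position (two bytes, little-endian).
int16_t readPosition(uint8_t addr) {
  Wire.beginTransmission(addr);
  Wire.write(REG_POSITION);
  Wire.endTransmission(false);                             // repeated start
  if (Wire.requestFrom(addr, (uint8_t)2) != 2) return -1;  // servo didn't answer
  int16_t lo = Wire.read();
  int16_t hi = Wire.read();
  return (int16_t)((hi << 8) | lo);
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  // Sample all five joints at roughly 50 Hz and stream them out,
  // which is enough to capture a pose being taught by hand.
  for (int i = 0; i < 5; ++i) {
    positions[i] = readPosition(SERVO_ADDR[i]);
    Serial.print(positions[i]);
    Serial.print(i < 4 ? '\t' : '\n');
  }
  delay(20);
}
```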

We know you come to Hackaday for your cute robot fix, and we’re not going to disappoint. Here’s a cute robot lamp, an obligatory Spot (a robot dog) type project, and if you’re more of a cat person, we’ve got that base covered as well.

Continue reading “ElectronBot: A Sweet Mini Desktop Robot That Ticks All The Boxes”


Hackaday Links: March 13, 2022

As Russia’s war on Ukraine drags on, its knock-on effects are being felt far beyond the eastern European theater. And perhaps nowhere is this more acutely felt than in the space launch industry, seeing that, at least until recently, Russia was pretty much everyone’s go-to ride to orbit. All that has changed now, at least temporarily, and the fallout has expanded to include halting sales of rocket engines used in other nations’ launch vehicles. Specifically, Roscosmos has put an end to exports of the RD-180 engine used in the US Atlas V launch vehicle, along with the RD-181 thrusters found in the Antares rocket. The loss of these engines may be more symbolic than practical, at least for the RD-180 — United Launch Alliance stopped selling launches on the Atlas V last year, and had already secured the engines it needs for the 29 flights it has booked. Still, there’s some irony that the Atlas V, which started life as an ICBM aimed at the USSR in the 1950s, has lost its Russian-made engines.

Bad news for Jan Mrázek’s popular open-source parametric search utility, which made JLCPCB’s component library easier to use. We wrote about it back in 2020, and things seemed to be going fine up until this week, when Jan got a take-down request for his service. When we first heard about this, we checked the application’s web page, which bore a big red banner that included some apparently unpleasant accusations Jan had received, including the words “reptile” and “parasitic.” The banner is still there, but the text has changed to a more hopeful tone, noting that LCSC, the component supplier for JLC’s assembly service, objected to the way Jan was pulling component data, and that they are now working together on something everyone can be happy with. Here’s hoping that the service is back in action soon.

Good news, everyone: Epson is getting into the 3D printer business. Eager to add a dimension to the planar printing world they’ve mostly worked in, they’ve announced that they’ll be launching a direct-extrusion printer sometime soon. Aimed at the industrial market, the printer will use a “flat screw extruder,” which is supposed to be similar to what the company uses on its injection molding machines. We sure didn’t know Epson was in the injection molding market, so it’ll be interesting to see if expertise there results in innovation in 3D printing, especially if it trickles down to the consumer printing market. Just as long as they don’t try to DRM the pellets, of course.

You can’t judge a book by its cover, but it turns out there’s a lot you can tell about a person’s genetics just by looking at their face. At least that’s according to an AI startup called FDNA, which makes an app called “Face2Gene” that the company claims can identify 300 genetic disorders by analyzing photos of someone’s face. Some genetic disorders, like Down syndrome, produce easily recognizable facial features, but other changes are far more subtle and hard to recognize. We had heard of cases where photos of toddlers posted on social media were used to diagnose retinoblastoma, a rare cancer of the retina. But this is on another level entirely.

And finally, working in an Amazon warehouse has got to be a tough gig, and if some of the stories are to be believed, it borders on being a horror show. But one Amazonian recently shared a video that showed what it’s like to get trapped by his robotic coworkers. The warehouse employee somehow managed to get stuck in a maze created by Amazon’s pods, which are stacks of shelves that hold merchandise and are moved around the warehouse floor by what amounts to robotic pallet jacks. Apparently, the robots know enough to not collide with their meat-based colleagues, but not enough to not box them in. To be fair, the human eventually found a way out, but it was a long search and it seems like another pod could have moved into position to block the exit at any time. You could see it as a scary example of human-robot interaction gone awry, but we prefer to look at it as the robots giving their friend a little unscheduled break away from the prying eyes of his supervisor.