Browser-Based Robot Dog Simulator In ~800 Lines Of Code

[Sergii] has been learning about robot simulation and wrote up a basic simulator for a robodog platform: the Unitree A1. It only took about 800 lines of code to do so, which probably makes it a good place to start if one is headed in a similar direction.

Right now, [Sergii]’s simulator is an interactive physics model that runs in the browser. Software-wise, once the model of the robot exists, the Rapier JavaScript physics engine takes care of the physics simulation. The robot’s physical layout comes from the manufacturer’s repository, so it doesn’t need to be created from scratch.

To make the tool useful, the application shows two models of the robot side by side. The one on the left is the control model, with interactive sliders for limb positions. Every movement of the control model is transmitted to the simulation model on the right, setting its pose. The simulation model is the one that actually simulates the physics and gravity acting on all the desired motions and positions. [Sergii]’s next step is to use the simulator to design and implement a simple walking gait controller, and we look forward to seeing how that turns out.
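A simple open-loop gait like the one [Sergii] is aiming for often starts as phase-offset sine waves feeding joint targets. Here's a minimal sketch of a trot generator; the leg names, joint layout, and all parameter values are our own illustrative assumptions, not anything from [Sergii]'s code:

```python
import math

def trot_gait(t, step_freq=1.5, swing_deg=20.0, lift_deg=15.0):
    """Return (hip, knee) angle offsets in degrees for four legs at time t.

    Legs are paired diagonally (FL+RR vs FR+RL), the classic trot pattern.
    Names and defaults are illustrative guesses, not the A1's real limits.
    """
    phase = 2.0 * math.pi * step_freq * t
    angles = {}
    for leg, offset in (("FL", 0.0), ("RR", 0.0),
                        ("FR", math.pi), ("RL", math.pi)):
        p = phase + offset
        hip = swing_deg * math.sin(p)             # fore/aft leg swing
        knee = max(0.0, lift_deg * math.cos(p))   # lift only on the swing half
        angles[leg] = (hip, knee)
    return angles
```

Calling this at each physics tick and pushing the resulting offsets into the simulation model's pose sliders would be one plausible way to prototype a gait before touching real hardware.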

If Unitree sounds familiar to you, it might be because we recently covered how an unofficial SDK was able to open up some otherwise-unavailable features on the robodogs, so check that out if you want to get a little more out of what you paid for.

Ask Hackaday: What’s The Deal With Humanoid Robots?

When the term ‘robot’ gets tossed around, our minds usually race to the image of a humanoid machine. These robots are a fixture in pop culture, and often held up as some sort of ideal form.

Yet, one might ask, why the fixation? While we are naturally obsessed with recreating robots in our own image, are these bipedal machines the perfect solution we imagine them to be?

Continue reading “Ask Hackaday: What’s The Deal With Humanoid Robots?”

A New Educational Robotics Platform

When looking for electronics projects to use in educational settings, there is no shortage of simple, lightweight, and easily-accessible systems to choose from. From robotic arms to drones to walking and wheeled robots, there is a vast array of options. But as technology marches on, the robotics platforms need to keep up as well. This turtle-style wheeled robot called the Trundlebot uses the latest in affordable microcontrollers on a relatively simple, expandable platform for the most up-to-date educational experience.

The robot is built around a Raspberry Pi Pico, with two low-cost stepper motors to drive the wheeled platform. The chassis can be built out of any material that can be cut on a laser cutter, but for anyone without this sort of tool it is also fairly easy to cut the shapes out by hand. The robot’s functionality can be controlled through Python code, and it is compatible with the WizFi360-EVB-Pico, which allows it to be remote-controlled through a web application. The web interface allows easy programming of commands for the Trundlebot, including a drag-and-drop feature for controlling the robot.
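With stepper motors on both wheels, turtle-graphics-style moves reduce to counting steps from the drive geometry. Here's a hedged sketch of the arithmetic for a turn-in-place; the wheel base, wheel diameter, and steps-per-revolution are placeholder guesses, not the Trundlebot's published dimensions:

```python
import math

def steps_for_turn(angle_deg, wheel_base_mm=120.0, wheel_dia_mm=60.0,
                   steps_per_rev=2048):
    """Steps each wheel must move (in opposite directions) to spin in place.

    Geometry defaults are assumptions for a generic small turtle robot.
    """
    # Each wheel traces an arc of a circle whose diameter is the wheel base
    arc = math.pi * wheel_base_mm * (angle_deg / 360.0)
    wheel_circ = math.pi * wheel_dia_mm
    steps = round(arc / wheel_circ * steps_per_rev)
    return steps, -steps  # left wheel forward, right wheel backward
```

On the real robot the returned counts would be fed to the stepper drivers; the same ratio also converts straight-line distances into step counts.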

With all of these features, wireless connectivity, and a modern microcontroller at the core, it is an excellent platform for educational robotics. From here it wouldn’t be too hard to develop line-follower, obstacle-avoiding, or maze-solving robots, and other components can easily be installed to facilitate those designs. If you’re looking for a different style of robot, this robotic arm can be produced for under $60, although it isn’t expressly for educational purposes.

Steel For Your Fighting Robot

The job of processing video after a large event must be a thankless one for whichever volunteer upon whose shoulders it falls, and thus it’s not unusual for talks at larger events to end up online much later than the event itself. Electromagnetic Field 2022 was last year, but they have continued to drop new videos. Among the latest batch is one from [Jennifer Herchenroeder], in which she discusses the steel used in her team’s BattleBot, Hijinx (Edit: her EMF talk was cut short due to time pressures, so she re-recorded it in full after the event and we’ve replaced the link. The EMF video meanwhile is here). The result is a fascinating introduction to the metallurgy of iron and steel, and is well worth a watch.

To fully understand the selection of armor steel, it’s necessary to start from first principles with iron, to look at its various allotropes, and to understand something of how those allotropes form and mix in the steelmaking and metalworking processes. We’re treated to a full description of the various tempering and hardening processes, before a panel-by-panel rundown of the various steels used by Hijinx.

For a Hackaday writer with a past in robot combat it’s fascinating to see how the design of robots has evolved over the decades since the British Robot Wars, and it’s particularly nice to see the current generation as part of our community. However, if you’ve tempted yourself, bear in mind that it’s not all plain sailing.

Continue reading “Steel For Your Fighting Robot”

Mapping The Depths With An Autonomous Solar Boat

Ever look out at a pond, stream, or river, and wonder how deep it is? For large bodies of water that are considered navigable, it’s easy enough to pull up a chart and find out. But what if there’s no public data for the area you’re interested in?

Well, you could spend all day on a little boat taking depth readings and making your own chart, but if you’re anything like [Clay] you could build a solar-powered autonomous robot to do it for you. He’s been working on the boat, which he calls Gumption Trap, for the better part of a year now. If we had to guess, we’d say the experience of designing and building it has ended up being a bit more interesting to him than the actual depth of the water — but that’s fine by us.

The design of the boat is surprisingly economical, as far as marine designs go. Two capped four-inch PVC pipes are used as pontoons, and 3D printed brackets attach those to an aluminum extrusion frame that holds the electronics and solar panel high above the water. This arrangement provides an exceptionally stable platform that would be all but impossible to flip under normal circumstances.

Around the back of the craft, there’s a pair of massive 3D printed thrusters, complete with some remarkably chunky printed propellers. The lack of rudders keeps things simple, with differential thrust between the two motors enough to keep the Gumption pointed in the right direction.
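Steering with differential thrust alone usually comes down to a small mixing function: a throttle and a steering input become two motor commands. This is a hypothetical sketch of that idea, not [Clay]'s actual code:

```python
def mix_thrust(throttle, steering):
    """Map throttle and steering, each in [-1, 1], to (left, right) thrust.

    Illustrative differential-thrust mixer; turning is achieved purely by
    running one thruster harder than the other, no rudder needed.
    """
    left = throttle + steering
    right = throttle - steering
    # If either command exceeds full power, scale both down together
    # so the left/right ratio (and thus the turn) is preserved.
    peak = max(1.0, abs(left), abs(right))
    return left / peak, right / peak
```

Full throttle with no steering drives both motors equally, while pure steering spins the boat in place by running the thrusters in opposite directions.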

Continue reading “Mapping The Depths With An Autonomous Solar Boat”

Tentacle Robot Is Like An Elephant Trunk

It sounds like bad science fiction or anime, but researchers are creating helical-artificial fibrous muscle structured tubular soft actuators. What? Oh, tentacle robot arms. Got it.

The researchers at Westlake University in China found inspiration in elephant trunks. Elephant trunks are entirely devoid of bone but use a tubular muscle structure. By deforming certain muscles, complex motion is possible. After understanding how they work, it was just a matter of making a similar structure from artificial muscle fibers.

The resulting actuator uses smart materials and has eleven different morphing modes — more than other attempts to build similar structures. The fabrication sounds difficult: it involves stretching chemically reactive materials over a form at specific winding angles.

The fibers react to light. Depending on the configuration, the stalk can seek light or avoid light. We were hoping the “Materials and Methods” section would give some ideas of how to do this ourselves, but it looks like you’d need some uncommon liquid crystal materials, and you’d also have to work out some of the details.

Animatronic tentacles are usually complex cable affairs. However, we have seen some soft robots in the past, too.


Animatronic Alexa Gives Amazon’s Echo A Face

Today, we’re surrounded by talking computers and clever AI systems that a few decades ago only existed in science fiction. But they definitely looked different back then: instead of a disembodied voice like ChatGPT, classic sci-fi movies typically featured robots that had something resembling a human body, with an actual face you could talk to. [Thomas] over at Workshop Nation thought this human touch was missing from his Amazon Echo, and therefore set out to give Alexa a face in a project he christened Alexatron.

The basic idea was to design a device that would somehow make the Echo’s voice visible and, at the same time, provide a pair of eyes that move in a lifelike manner. For the voice, [Thomas] decided to use the CRT from a small black-and-white TV. By hooking up the Echo’s audio signal to the TV’s vertical deflection circuitry, he turned it into a rudimentary oscilloscope that shows Alexa’s waveform in real time. An acrylic enclosure shields the CRT’s high voltage while keeping everything inside clearly visible.

To complete the face, [Thomas] made a pair of animatronic eyes according to a design by [Will Cogley]. Consisting of just a handful of 3D-printed parts and six servos, the mechanism forms a pair of eyes that can move in all directions and blink just like a real person’s. Thanks to a “person sensor,” which is basically a smart camera that detects faces, the eyes automatically follow anyone standing in front of the system. The eyes stay closed while the system is dormant, but they open and start looking for nearby faces when the Echo hears its wake word, much as a human or animal responds to its name.
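The face-following behavior boils down to mapping a detected face's position in the camera frame onto pan/tilt servo angles. Here's a minimal sketch of that mapping; the frame resolution and servo travel limits are assumptions for illustration, not values from [Thomas]'s build:

```python
def eyes_toward_face(face_x, face_y, frame_w=320, frame_h=240,
                     max_pan_deg=30.0, max_tilt_deg=20.0):
    """Convert a face's pixel position into (pan, tilt) servo angles.

    Assumes a 320x240 sensor frame and symmetric servo limits; both are
    placeholder guesses.
    """
    # Normalize the face centre to [-1, 1] about the middle of the frame
    nx = (face_x - frame_w / 2) / (frame_w / 2)
    ny = (face_y - frame_h / 2) / (frame_h / 2)
    return nx * max_pan_deg, ny * max_tilt_deg
```

A real eye mechanism would low-pass filter these targets so the eyes glide rather than snap, but the core lookup is just this proportional mapping.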

The end result absolutely looks the part: we especially like the eye tracking feature, which gives it that human-like appearance that [Thomas] was aiming for. He isn’t the first person trying to give Alexa a face, though: there are already cute Furbys and creepy bunnies powered by Amazon’s AI, and we’ve even seen Alexa hooked up to an animatronic fish.

Continue reading “Animatronic Alexa Gives Amazon’s Echo A Face”