3D-printed GLaDOS home assistant

GLaDOS Voice Assistant Passive-Aggressively Automates Home

With modern voice assistants we can tell a computer to play our favorite music, check the weather, or turn on a light. Like many of us, [nerdaxic] gave in to the convenience and perceived simplicity of various home automation products made by Google and Amazon. Also like many of us, he found it a bit difficult to accept the privacy implications that surround such cloud-connected devices. But after selling his Google Home and Amazon Echo, [nerdaxic] missed being able to control his smart home by voice command. Instead of buying back into the closed ecosystems he’d left behind, [nerdaxic] decided to open his home to a murderous, passive-aggressive, sarcastic, slightly insane AI: GLaDOS, which you can see in action after the break.

Using open source designs from fellow YouTube creator [Mr. Volt], [nerdaxic] 3D printed as much of the GLaDOS animatronic model as he could, and implemented much of the same hardware to make it work. [nerdaxic] then put more open source software to use and created a functional, if somewhat limited, home AI that can manage his home automation, report the weather, and tell jokes, among other things. GLaDOS doesn’t fail to deliver some great one-liners inspired by the original Portal games while heeding [nerdaxic]’s commands, either.

A ReSpeaker from Seeed Studio cleans up the audio sent to a Raspberry Pi 4, and allows for future expansion that will let GLaDOS turn to look in the direction of whoever is speaking to it. With its IR-capable camera, another planned enhancement will let GLaDOS stare at people as they walk around. That’s not creepy at all, right? [nerdaxic] also plans to bring speech-to-text processing in-house, replacing the Google Cloud Speech-to-Text API used in the current iteration, and he’s made everything available on GitHub so that you too can have a villainous AI hanging on your every word.
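For now, the voice recognition side leans on Google’s cloud. As a rough illustration of what that round trip looks like in Python (this is not [nerdaxic]’s actual code, which lives in his GitHub repo), a minimal Google Cloud Speech-to-Text call might look something like this:

```python
# Minimal sketch of a Google Cloud Speech-to-Text round trip, not [nerdaxic]'s
# actual implementation. Assumes the google-cloud-speech package and 16 kHz
# mono LINEAR16 audio, e.g. captured from the ReSpeaker array.
from google.cloud import speech

def transcribe_command(wav_bytes: bytes) -> str:
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )
    audio = speech.RecognitionAudio(content=wav_bytes)
    response = client.recognize(config=config, audio=audio)
    # Join the top hypothesis from each result into one command string
    return " ".join(r.alternatives[0].transcript for r in response.results)
```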

Of course if having GLaDOS looming isn’t enough, you could always build a functional life size Portal turret or listen to the radio on your very own Portal Radio.

Continue reading “GLaDOS Voice Assistant Passive-Aggressively Automates Home”

Flamethrower weedkiller mounted on a robot arm riding a tank-tracked base

Don’t Sleep On The Lawn, There’s An AI-Powered, Flamethrower-Wielding Robot About

You know how it goes: you’re just hanging out in the yard, there aren’t enough hours in the day, and weeding the lawn is such a drag. Then an idea pops into your head. How about we attach a gas-powered flamethrower to a robot arm, drive it around on a tank-tracked robotic base, and have it operate autonomously with an AI brain? Yes, that sounds like a good idea. Let’s do that. And so, [Dave Niewinski] did exactly that with his Ultimate Weed Killing Robot.

And you thought the robot overlords might take a more subtle approach and take over the world one coffee machine at a time? No, straight for the fully autonomous flamethrower it is, then.

This build uses a Kinova Gen3 six-axis arm, mounted to an AgileX Robotics Bunker base. Control is via a Connect Tech Rudi-NX box, which contains an NVIDIA Jetson Xavier NX edge AI computing module. Wow, that was a mouthful!

Connectivity from the controller to the base is via CAN bus, but sadly there’s no mention of how the robot arm is hooked up. At least this particular arm sports an effector-mounted camera system, which can feed straight into the Jetson, simplifying the build somewhat.
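The write-up doesn’t document the base’s CAN protocol, so purely as an illustration of what sending a drive frame over SocketCAN from Python looks like (the arbitration ID and payload below are made up), it might go something like this:

```python
# Illustration only: the Bunker's actual CAN protocol isn't documented in the
# write-up, so the arbitration ID and payload here are placeholders.
# Assumes python-can 4.x and a SocketCAN interface (can0) on the Jetson.
import can

with can.Bus(channel="can0", interface="socketcan") as bus:
    # Hypothetical "drive" frame: two speed bytes, left and right track
    frame = can.Message(arbitration_id=0x111,
                        data=[0x20, 0x20],
                        is_extended_id=False)
    bus.send(frame)
```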

To start the software side of things, [Dave] took a video using his mobile phone while walking his lawn. Next, he used Roboflow to label image stills containing weeds, which were in turn used to help train a vision AI system. The actual AI training was written in Python using Google Colaboratory, which is itself based on the awesome Jupyter Notebook (see also JupyterLab on the main site; if you do any data science at all and haven’t tried it yet, you’ll kick yourself for not doing so!). Colaboratory would not be all that useful for this by itself, except that it gives you direct, free GPU access via the cloud, so you can use it for AI workloads without needing fancy (and currently hard to get) GPU hardware on your desk.
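The write-up doesn’t say which detection framework [Dave] trained, so the following is only a hedged sketch of what the Colab side of such a pipeline can look like, assuming a YOLO-family model via the ultralytics package and a placeholder dataset file:

```python
# Rough sketch of the Colab side of a weed-detection pipeline. The framework,
# model, and dataset names are assumptions for illustration, not [Dave]'s code.
import torch
from ultralytics import YOLO

print("GPU available:", torch.cuda.is_available())  # Colab's free GPU

model = YOLO("yolov8n.pt")                 # start from a small pretrained model
model.train(data="weeds.yaml", epochs=50)  # weeds.yaml points at the labeled stills
results = model("lawn_frame.jpg")          # run detection on a new frame
```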

Details of the hardware may be a little sparse, but at least the software required can be found on the WeedBot GitHub. It’s not like most of us have this exact hardware lying around anyway. For a more complete description of this terrifying contraption, check out the video after the break.

Continue reading “Don’t Sleep On The Lawn, There’s An AI-Powered, Flamethrower-Wielding Robot About”

Tiny ESP32 Strider Walks The Walk

Wheels might be the simplest method of locomotion for robots, but walkers are infinitely more satisfying to watch. This is certainly the case for [Chen Liang]’s tiny Strider walker, controlled by an ESP32 camera board.

The Strider mechanism might look similar to Strandbeest walkers, but it lifts its feet higher, allowing it to traverse rougher terrain. [Chen]’s little 3D printed version is driven by a pair of geared N20 motors, with three legs on each side. The ESP32 camera board allows for control and an FPV video feed using WiFi, with power coming from a 14500 LiFePO4 battery. The width required by the motors, leg mechanisms, and bearings means the robot is quite wide, to the point that it could get stuck on something that’s outside the camera’s field of view. [Chen] is working to make it narrower by using continuous rotation servos and a wire drive shaft.
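With a single geared motor driving each side, a walker like this typically steers differentially, running the two sides at different speeds. Purely as a generic illustration (this isn’t [Chen]’s actual firmware), mixing a throttle and steering input into left and right motor commands can be as simple as:

```python
# Generic differential-drive mixing, illustrating how one motor per side gives
# both drive and steering. Not [Chen Liang]'s firmware.
def mix(throttle, steering):
    """Map throttle and steering (each -1..1) to left/right motor commands."""
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right

print(mix(0.8, 0.3))   # forward with a right-hand bias: (1.0, 0.5)
```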

We’ve seen no shortage of riffs on many-legged walkers, like the TrotBot and Strider mechanisms developed by [Wade] and [Ben Vagle], and their website is an excellent resource for prospective builders.

Continue reading “Tiny ESP32 Strider Walks The Walk”

Small Scale Mad Max: Danny Huynh’s Dystopian Animatronics

The hacker spirit is always alive and well in post-apocalyptic fiction, as characters throw together contraptions from whatever junk they can find. While these might not always be practical or possible in reality, their primary purpose is usually to look the part. This is definitely the case for [Danny Huynh]’s post-apocalyptic animatronic creations, which look like they can slot straight into Mad Max or Fallout.

[Danny] is an avid RC enthusiast, so many of the models are highly customized off-the-shelf RC cars. However, it’s the lifelike moving characters in these models that really catch the eye. Their hands and feet move with the steering and throttle, and in the motorcycle builds they will often lean with the turns. Other notable builds include a hexapedal taxi and a couple of animatronic bands.

All the vehicle builds are electric, but it looks like [Danny] often includes an audio module to simulate a roaring engine. He makes extensive use of servos and linkages for character movement, with wiring and electronics carefully hidden by paint or bodywork.

With all the CGI technology available today, great animatronic builds like an eerily lifelike heart or a talking Nikola Tesla are all the more impressive to see.

Continue reading “Small Scale Mad Max: Danny Huynh’s Dystopian Animatronics”

Animatronic Puppetry Controller Skips Joystick Or Keyboard

One of the major challenges of animatronics is creating natural looking motion. You can build something with an actuator for every possible degree of freedom, but it will still be disappointing if you are unable to control it to smoothly play the part. [Mr. Volt] has developed a passion for animatronic projects, but found programming them tedious, and manual control with keyboard or controller difficult to do right. As an alternative, he is building Waldo, an electronic puppetry controller.

The Waldo rig is being developed in conjunction with [Mr. Volt]’s build of Wheatley, the talkative ball-shaped robot from the Portal 2 game. The puppetry rig consists of a series of rings for [Mr. Volt]’s hand, with the position of each being read by an angle sensor. This lets him control the orientation of Wheatley’s body and eyeball, as well as its eyelids and handles. Wheatley and Waldo both still need a few refinements, but we look forward to seeing the finished project in action.
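At its core, each ring’s angle reading just gets rescaled into a servo command on the Wheatley side. As a generic sketch of that idea (the ADC, GPIO pin, and update rate here are hypothetical, and this is not [Mr. Volt]’s code), it might look like:

```python
# Generic "read an angle sensor, mirror it to a servo" sketch. The hardware
# assumed here (MCP3008 ADC on a Raspberry Pi, servo on GPIO 17) is
# hypothetical and not what [Mr. Volt] actually uses.
from gpiozero import MCP3008, Servo
from time import sleep

ring_angle = MCP3008(channel=0)   # potentiometer-style angle sensor, reads 0.0..1.0
eye_servo = Servo(17)             # hobby servo expecting -1.0..1.0

while True:
    eye_servo.value = ring_angle.value * 2 - 1   # rescale 0..1 to -1..1
    sleep(0.02)                                  # ~50 Hz update
```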

The Portal games have inspired several featured projects, including GLaDOS, the turrets, and of course more Wheatley builds.

Continue reading “Animatronic Puppetry Controller Skips Joystick Or Keyboard”

Smooth Servo Motion For Lifelike Animatronics

Building an animatronic robot is one thing, but animating it in a lifelike fashion is a completely different challenge. Hobby servos are cheap and popular for animatronics, but just letting them slew at maximum speed isn’t particularly lifelike. In the video after the break, [James Bruton] demonstrates how to achieve natural motion with a simple animatronic head and a few extra lines of code.

Very little natural body movement happens at a constant speed; it’s always accelerating or decelerating. When we move our heads to look at something around us, our neck muscles accelerate our head sharply in the chosen direction, and it then slows down gradually as it reaches its endpoint. To do this in Arduino/C code, a new intermediate position for the servo is calculated on each pass through the main loop until it reaches the final position. The intermediate value is the sum of 95% of the current position and 5% of the target position. This gives the effect of the natural motion described above, and the ratio can be changed to suit the desired speed.
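In other words, each update moves the commanded position a fixed fraction of the remaining distance toward the target. [James]’s code is Arduino/C, but the easing step itself is language-agnostic; sketched in Python it looks like this:

```python
# The easing described above, sketched in Python (the original is Arduino/C):
# each update moves the commanded position 5% of the way toward the target,
# giving a fast start and a gradual slowdown near the endpoint.
def ease_toward(current, target, ratio=0.05):
    return current * (1 - ratio) + target * ratio

position, target = 0.0, 90.0
for _ in range(5):
    position = ease_toward(position, target)
    print(round(position, 1))   # 4.5, 8.8, 12.8, 16.7, 20.4
```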

The delay function is usually one of the first timing mechanisms that new Arduino programmers learn about, but it’s not suited to this application, especially when you’re controlling multiple servos simultaneously. Instead, the millis function is used to keep track of elapsed time in the main loop, which fires the position update commands at the specified interval. Adafruit wrote an excellent tutorial on this method of multitasking, which [James] based his code on. Of course, this should be old news to anyone who has been doing embedded programming for a while, but it’s an excellent introduction for newcomers.
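The same non-blocking pattern, with the loop checking how much time has passed rather than sleeping, can be sketched in Python using time.monotonic() in place of millis() (again, just an illustration, not [James]’s Arduino code):

```python
# Non-blocking timing sketched in Python in place of Arduino's millis():
# the loop never blocks, and the servo position is only updated once its
# interval has elapsed, leaving time to service other servos and tasks.
import time

UPDATE_INTERVAL = 0.02        # seconds between position updates (~50 Hz)
last_update = time.monotonic()
position, target = 0.0, 90.0

while position < target - 0.5:
    now = time.monotonic()
    if now - last_update >= UPDATE_INTERVAL:
        last_update = now
        position = position * 0.95 + target * 0.05   # easing step from above
    # ... other servos and tasks get serviced here without waiting ...
```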

Like most of [James]’s projects, all the code and CAD files are open source and available on GitHub. His projects make regular appearances here on Hackaday, like his mono-wheel balancing robot and mechanically multiplexed flip-dot display.

Continue reading “Smooth Servo Motion For Lifelike Animatronics”

The BornHack Badge Gets A Bubble

In a year of semiconductor shortages it’s a difficult task to deliver an electronic conference badge, so this year’s BornHack camp in Denmark had an SAO prototyping board as its badge. Some people made blinkies with theirs, but that wasn’t enough for [Inne], who had to go a step further with a light-up pneumatic bubble badge. It’s based on a previous project producing inflatable silicone bubbles, but in a portable badge form.

On the front of the PCB is a multi-colour LED for illumination, while on the back are a small microcontroller board, a pressure sensor, and a motor driver circuit. A small air pump and battery sit in a pocket, connected by a cable and a flexible tube, allowing the bubble to be inflated at will. An interesting detail is the use of a cut-down hypodermic needle to carry the air through the silicone wall of the bubble. Seen up close at the camp, the effect was unnervingly organic; if there’s an uncanny valley of badges, this is it.
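The write-up doesn’t go into the firmware, but pairing a pressure sensor with a pump driver suggests a simple closed loop. Purely as a generic illustration (the helper functions and target pressure below are hypothetical, not [Inne]’s code), a bang-bang inflation loop might look like:

```python
# Generic bang-bang inflation loop, just to show how a pressure sensor and a
# pump driver can work together. read_pressure_kpa() and set_pump() are
# hypothetical stand-ins, not [Inne]'s actual firmware.
import time

TARGET_KPA = 5.0      # desired gauge pressure in the bubble (made-up value)
HYSTERESIS = 0.5      # stop a little above target, restart a little below

def regulate_bubble(read_pressure_kpa, set_pump):
    pumping = False
    while True:
        pressure = read_pressure_kpa()
        if pressure < TARGET_KPA - HYSTERESIS and not pumping:
            set_pump(True)      # bubble is soft: start inflating
            pumping = True
        elif pressure > TARGET_KPA + HYSTERESIS and pumping:
            set_pump(False)     # bubble is full: stop the pump
            pumping = False
        time.sleep(0.05)
```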

We don’t see much in the way of soft robotics on these pages, so this happy crossover with BadgeLife is a special treat. It’s not entirely alone here, though.