Hackaday Prize 2023: Hydrocleaner Nips Pollution In The Bud

It’s unfortunate, but a lot of trash ends up in our rivers and, eventually, our oceans. Cleaning efforts can be costly and require a lot of human power. One way to keep trash from reaching the ocean is to attack the problem at the river level. That’s the idea behind [Xieshi Zhang]’s Hydrocleaner, a semi-autonomous river-cleaning robot.

One current method for removing trash uses remote-controlled boats with nets attached. These typically travel in one direction, sort of sweeping left and right and probably missing trash in the process.

Hydrocleaner is capable of turning back and forth, ensuring a much more complete clean-up. A camera spots the trash, and the twin-pontoon design lets it flow easily between the hulls and into the net behind. Currently, the brain behind this boat is a Jetson Nano, although the project is still a work in progress. The eventual idea is that the boat will use GNSS guidance to navigate itself toward the trash.
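The write-up doesn’t detail how the GNSS guidance will work, but waypoint navigation of this kind usually boils down to computing the bearing from the boat’s current fix to a target fix (say, wherever the camera spotted trash) and steering to close the heading error. A minimal sketch of that math, with coordinates and function names that are purely illustrative:

```python
import math

def bearing_to_target(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from the boat's fix to a target fix."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def heading_error(current_heading, target_bearing):
    """Signed error in degrees, wrapped to [-180, 180], for a steering controller."""
    return (target_bearing - current_heading + 180) % 360 - 180

# Example: boat at one fix, trash spotted a few metres to the north-east
print(bearing_to_target(40.0000, -74.0000, 40.0001, -73.9999))  # roughly 37 degrees
print(heading_error(90.0, 37.0))                                # -53: turn left
```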

Of course, you could always fight trash with trash.

Star Wars Pit Droid Has A Jetson Brain

In the Star Wars universe, pit droids are little foldable robots that perform automated repairs on spacecraft and the like. They were introduced in 1999’s The Phantom Menace, and beyond the podracing scenes, are probably the only good thing to come out of that particular film.

[Goran Vuksic] wanted a pit droid of his own, and reasoned that if he was going to go through the trouble of sanding and painting all the 3D printed components so they look like the real bot, he might as well add some smarts to it. While this droid won’t be fixing podracers anytime soon, its onboard Jetson Orin Nano Developer Kit does pack a considerable amount of processing under that dome.

A webcam mounted in the bot’s eye socket is connected to the Jetson, which is running an image detection and identification routine based on the example code provided by NVIDIA. The single-board computer uses a relay to blink some LEDs on and off when a human is detected, and a pair of servos pan-and-tilt the bot’s head towards whoever has caught its gaze.
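Since the detection routine is based on NVIDIA’s stock examples, the vision side of a build like this can be sketched with the jetson-inference Python bindings. The GPIO pin number, the pan range, and the set_pan_angle() helper below are placeholders of ours, not details from [Goran]’s build:

```python
from jetson_inference import detectNet
from jetson_utils import videoSource
import Jetson.GPIO as GPIO

RELAY_PIN = 7  # placeholder pin driving the LED relay

GPIO.setmode(GPIO.BOARD)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

net = detectNet("ssd-mobilenet-v2", threshold=0.5)  # stock NVIDIA example model
camera = videoSource("/dev/video0")                 # the webcam in the eye socket

def set_pan_angle(angle):
    """Placeholder: a real build would command the pan servo here (e.g. via a PCA9685)."""
    pass

while True:
    img = camera.Capture()
    detections = net.Detect(img)
    people = [d for d in detections if net.GetClassDesc(d.ClassID) == "person"]

    # Blink the LEDs whenever a human is in frame
    GPIO.output(RELAY_PIN, GPIO.HIGH if people else GPIO.LOW)

    if people:
        # Steer the head toward the first detected person: map the detection's
        # horizontal centre to an assumed +/-45 degree pan range
        cx = (people[0].Left + people[0].Right) / 2
        set_pan_angle((cx / img.width - 0.5) * 90)
```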

It’s no surprise that [Goran] picked the Jetson Orin over competing SBCs for this task — in our review of the Orin Nano Developer Kit a few months ago, we found it was able to hit nearly 200 frames per second while performing this sort of real-time image analysis. So there’s plenty of room to grow should he want to integrate more complex image recognition tasks.

For example, he could follow in the footsteps of [Kris Kersey], and put a functional data overlay on top of the video to give his bot Iron Man vision.

Continue reading “Star Wars Pit Droid Has A Jetson Brain”

Hands-On: NVIDIA Jetson Orin Nano Developer Kit

NVIDIA’s Jetson line of single-board computers is doing something different in a vast sea of relatively similar Linux SBCs. Designed for edge computing applications, such as a robot that needs to perform high-speed computer vision while out in the field, these boards provide exceptional performance in a package that’s of comparable size and weight to other SBCs on the market. The only difference, as you might expect, is that they tend to cost a lot more: the current top-of-the-line Jetson AGX Orin Developer Kit is $1999 USD.

Luckily for hackers and makers like us, NVIDIA realized they needed an affordable gateway into their ecosystem, so they introduced the $99 Jetson Nano in 2019. The product proved so popular that just a year later the company refreshed it with a streamlined carrier board that dropped the cost of the kit down to an incredible $59. Looking to expand on that success even further, today NVIDIA announced a new upmarket entry into the Nano family that lies somewhere in the middle.

While the $499 price tag of the Jetson Orin Nano Developer Kit may be a bit steep for hobbyists, there’s no question that you get a lot for your money. Capable of performing 40 trillion operations per second (TOPS), NVIDIA estimates the Orin Nano is a staggering 80X as powerful as the previous Nano. It’s a level of performance that, admittedly, not every Hackaday reader needs on their workbench. But the allure of a palm-sized supercomputer is very real, and anyone with an interest in experimenting with machine learning would do well to weigh (literally, and figuratively) the Orin Nano against a desktop computer with a comparable NVIDIA graphics card.

We were provided with one of the very first Jetson Orin Nano Developer Kits before their official unveiling during NVIDIA GTC (GPU Technology Conference), and I’ve spent the last few days getting up close and personal with the hardware and software. After coming to terms with the fact that this tiny board is considerably more powerful than the computer I’m currently writing this on, I’m left excited to see what the community can accomplish with the incredible performance offered by this pint-sized system.

Continue reading “Hands-On: NVIDIA Jetson Orin Nano Developer Kit”

Laser Zaps Cockroaches Over One Meter

You may have missed this month’s issue of Oriental Insects, in which a project by [Ildar Rakhmatulin] of Heriot-Watt University in Edinburgh caught our attention. [Ildar] led a team of researchers in the development of an AI-controlled laser that neutralizes moving cockroaches at distances of up to 1.2 meters. Noting the various problems with using chemical pesticides for pest control, his team sought out an unconventional approach.

The heart of the pest controller is a Jetson Nano, which uses OpenCV and YOLO object detection to find the cockroaches, and galvanometers to steer the laser beam. Three different lasers were used for testing, allowing the team to evaluate a range of wavelengths, power levels, and spot sizes. Unsurprisingly, the higher-power 1.6 W laser was the most effective and the quickest.
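The object detection itself is standard YOLO fare, so the interesting glue is turning a detected pixel position into galvanometer drive values. One way to do that, sketched below, is a four-point calibration and an OpenCV homography; the calibration points and the final drive call are our own illustration, not code from [Ildar]’s repository:

```python
import numpy as np
import cv2

# One-time calibration: steer the laser to four known galvo settings and record
# where the dot lands in the camera image (values below are made up for illustration).
pixel_pts = np.float32([[112, 96], [518, 90], [530, 410], [104, 402]])
galvo_pts = np.float32([[-1.0, 1.0], [1.0, 1.0], [1.0, -1.0], [-1.0, -1.0]])
H, _ = cv2.findHomography(pixel_pts, galvo_pts)

def pixel_to_galvo(cx, cy):
    """Map the centre of a detected cockroach (pixels) to x/y galvo commands."""
    pt = np.float32([[[cx, cy]]])
    gx, gy = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(gx), float(gy)

# In the detection loop: take a YOLO bounding box (x, y, w, h) and aim the laser
x, y, w, h = 300, 220, 40, 24          # hypothetical detection
gx, gy = pixel_to_galvo(x + w / 2, y + h / 2)
# drive_galvos(gx, gy)                 # placeholder for the DAC/galvo driver call
print(gx, gy)
```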

The project is on GitHub (here) and the cockroach machine learning image set is available here. But as [Ildar] points out in the conclusion of the report, this is dangerous: it’s suitable for academic research, but it’s not quite ready for general use, lacking any safety features. The report is also full of cockroach trivia, such as the fact that a cockroach’s average speed is 4.8 km/h, and that they run much faster when being zapped. If you want to experiment with cockroaches yourself, a link is provided to a pet store that sells the German cockroach (Blattella germanica) that was the target of this report.

If this project sounds familiar, that’s because it improves on a previous project we wrote about last year, which used similar techniques to zap mosquitoes.

Continue reading “Laser Zaps Cockroaches Over One Meter”


Halloween Game Lets You Shoot Zombies With A Laser-Powered Crossbow

Suppose you were looking for all the essential elements to make a great Halloween-themed shooting game. Zombies? Check. Giant “lasers”? Check. Crossbows shooting forks? We’ve got you covered. Check out “Fork The Zombies“, which was set up by [piles.of.spam] to entertain the neighborhood kids this Halloween.

The game is played on a big screen, which shows a horde of angry zombies marching toward the player, who has to shoot as many as possible before they reach the front of the screen. The weapon provided is a crossbow; when the trigger is pulled, a fork is launched and hopefully skewers one of the ghouls. The game was written using an open-source engine called Urho3D, which takes care of all the hard-core 3D and physics work, allowing the user to focus on designing the gameplay and visuals.

To give the game a bit more of a physical feel, [piles.of.spam] made an actual crossbow for the player to wield. Its handle was cut from a scrap piece of wood, using a band saw for the general shape and a CNC machine for the delicate cut-outs that hold a laser pointer, an ESP32, and a microswitch-based trigger. The laser shines onto the game screen, while the ESP32 sends out a data packet over WiFi when the trigger is pulled.
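The write-up doesn’t spell out the packet format, but since the trigger-to-game link is just a connectionless datagram, the receiving end can be as simple as a non-blocking UDP socket polled from the game loop. A minimal sketch under that assumption (the port number and payload are ours, not from the build):

```python
import socket

TRIGGER_PORT = 4210  # assumed port; the real build may use something else

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", TRIGGER_PORT))
sock.setblocking(False)

def trigger_pulled():
    """Return True if the ESP32 sent a trigger packet since the last call."""
    fired = False
    while True:
        try:
            data, _addr = sock.recvfrom(64)
        except BlockingIOError:
            break                      # no more packets queued
        if data.strip() == b"FIRE":    # assumed payload
            fired = True
    return fired

# Inside the game loop:
# if trigger_pulled():
#     launch_fork()
```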

The location of the shot is tracked using a clever trick: a webcam is pointed at the screen, with a red color filter in front. This way, it only sees the red laser dot moving across the screen. The resulting image is processed using the Python OpenCV library, which provides functions to convert the relative motion of the pointer on the screen to an absolute position along the playing field.
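A minimal sketch of what that dot tracking can look like, assuming a one-time calibration of the screen’s four corners in the camera image; the calibration values are made up, and the build’s exact approach may differ:

```python
import cv2
import numpy as np

# Corners of the game screen as seen by the webcam (made-up calibration values),
# mapped onto a 1920x1080 playing field.
cam_corners = np.float32([[72, 40], [590, 52], [580, 420], [66, 408]])
field_corners = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
M = cv2.getPerspectiveTransform(cam_corners, field_corners)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # With the red filter in front of the lens, the laser dot is by far the
    # brightest thing in the red channel; grab its location directly.
    red = cv2.GaussianBlur(frame[:, :, 2], (9, 9), 0)
    _minv, maxv, _minloc, (x, y) = cv2.minMaxLoc(red)
    if maxv > 200:  # dot visible
        fx, fy = cv2.perspectiveTransform(np.float32([[[x, y]]]), M)[0, 0]
        print(f"laser at field position ({fx:.0f}, {fy:.0f})")
```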

The computing hardware consists of a pair of Jetson Nano boards, which sport quad-core ARM A57 CPUs as well as powerful graphics hardware to generate the game’s visuals. The end result is impressive, especially given the fact that all of this was designed and built in just three weeks. It was apparently a great hit with its intended audience, as visitors queued to try their hand at shooting the hungry zombies.

Laser pointers are an obvious tool for creating shooting games: we’ve seen ones with a single round target, a set of shapes set up around you, and even metal cans that fall over and stand up again. But if you need to protect yourself in case of an actual zombie apocalypse, a slingshot that shoots knives might be more useful.

Continue reading “Halloween Game Lets You Shoot Zombies With A Laser-Powered Crossbow”

Cablecam Is An Exercise In System Integration

Drones have become the standard for moving aerial camera platforms, but another option that sees use in the professional world is the cable camera. As an exercise in integrating mechanics, electronics, and software, [maxipalay] created his own Cablecam.

Cablecam is built around a pair of machined wood plates, with some pulleys and motor reduction gearing between them. A brushless hobby motor moves the platform along the rope or cable, driven by a drone ESC. Since the ESC doesn’t have a reverse function, [maxipalay] used four relays controlled by an Arduino to swap the connections of two of the motor wires and reverse direction. The main onboard controller is a Raspberry Pi, connected to a camera module mounted on a two-axis gimbal for stabilization. A GPS module was also added for positioning information on long cables.

The base station is built around an Nvidia Jetson Nano connected to a 7″ screen mounted in a plastic case. Video, telemetry, and control signals are communicated using the open-source Wifibroadcast protocol. This uses off-the-shelf WiFi hardware in connectionless mode to broadcast UDP packets, which avoids the lengthy WiFi reconnection process every time a connection drops out. The motion of Cablecam can be controlled manually with a potentiometer on the control station, or handed over to the machine vision capabilities of the Jetson, which can automatically track and follow people.
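Wifibroadcast handles the low-level radio details itself, but the appeal of a connectionless link is easy to illustrate with an ordinary UDP broadcast: the sender simply fires packets onto the network without waiting for an association or an acknowledgement, so a receiver that drops out picks the stream back up the moment it can hear packets again. The snippet below is a simplified stand-in for that idea, not the actual Wifibroadcast code, and the port and telemetry fields are made up:

```python
import json
import socket
import time

TELEMETRY_PORT = 5600  # assumed port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

while True:
    # Hypothetical telemetry fields; the real system also streams video and control
    packet = json.dumps({"pos_m": 12.4, "speed_mps": 0.8, "battery_v": 11.7})
    # Fire-and-forget: no connection, no retries, any listener on the LAN receives it
    sock.sendto(packet.encode(), ("255.255.255.255", TELEMETRY_PORT))
    time.sleep(0.1)
```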

We’ve seen several cable robots over the years, including a solar-powered sensor platform that resembles a sloth.

Real Time Object Detection For $59

There was a time when building a machine that could identify objects in a camera feed was difficult, even without trying to do it in real time. But now, you can do it with a Jetson Nano board for under $60. How well does it work? Watch [Murtaza’s] video below and see what you think.

The first few minutes of the video were enough to pique our interest, and a good thing, too, because those 50 lines of code come wrapped in a 50-plus-minute video! It is worth watching, though, because there’s a lot of good information about how to apply this technique in your own projects.

Continue reading “Real Time Object Detection For $59”