Arduino-Powered Trap Hopes To Catch Mice

The old adage that you’ll make a fortune by developing a better mouse trap is not super realistic, as the engineers behind Sony’s Betamax video tape standard could tell you. However, you can still learn a lot building your own, as this project from [ROBO HUB] demonstrates.

The trap is intended to catch mice in a humane fashion, without injury to the animal. To that end, it uses an Arduino Nano armed with an ultrasonic distance sensor to detect when mice have entered a plastic container. The container’s hinged door is held open with a servo. When a mouse is detected, the servo trips the door, which snaps shut under the power of an elastic band.
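
The control logic is about as simple as embedded projects get. [ROBO HUB]’s exact wiring and thresholds aren’t spelled out, but a minimal Arduino sketch along these lines captures the idea, with the pins, servo angles, and trip distance all assumed for illustration:

#include <Servo.h>

// Pin assignments and angles are illustrative; the video doesn't specify them.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int SERVO_PIN = 6;
const int LATCH_OPEN = 90;        // servo angle holding the door up (assumed)
const int LATCH_RELEASE = 0;      // angle that lets the elastic snap it shut
const long TRIP_DISTANCE_CM = 10; // "mouse present" threshold (assumed)

Servo latch;

long readDistanceCm() {
  // Standard HC-SR04-style measurement: 10 us trigger pulse, time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000); // timeout means no echo
  if (duration == 0) return -1;
  return duration / 58; // roughly 58 us per cm, round trip
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  latch.attach(SERVO_PIN);
  latch.write(LATCH_OPEN); // arm the trap
}

void loop() {
  long d = readDistanceCm();
  if (d > 0 && d < TRIP_DISTANCE_CM) {
    latch.write(LATCH_RELEASE); // drop the door
    while (true) {}             // stay triggered until a human resets it
  }
  delay(50);
}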

The key to making this design work well is ensuring that there are no gaps in the closed container that the mouse can use to escape. They’re wily creatures able to squeeze through positively tiny spaces, so it’s important to get this right. Besides that, you want to check the trap regularly, lest any caught mice simply claw and chew their way out.

We’ve seen a few mousetraps around these parts before, too. Video after the break.

Continue reading “Arduino-Powered Trap Hopes To Catch Mice”

High Quality 3D Scene Generation From 2D Source, In Realtime

Here’s some fascinating work presented at SIGGRAPH 2023: a method for radiance field rendering using a novel technique called Gaussian Splatting. What’s that mean? It means synthesizing a 3D scene from 2D images, in high quality and in real time, as the short animation above shows.

Neural Radiance Fields (NeRFs) are a method of leveraging machine learning to, in a way, do what photogrammetry does: synthesize complex scenes and views based on input images. But NeRFs work in a fraction of the time, and require only a fraction of the source material. There are different ways to go about this, and unsurprisingly there tends to be a clear speed versus quality tradeoff. But as the video accompanying this new work seems to show, clever techniques can deliver the best of both worlds.
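
If you’re wondering what a “splat” actually does at render time, the core of any splatting-style renderer is a per-pixel, front-to-back alpha composite over depth-sorted primitives, the same discrete volume rendering sum that NeRFs evaluate. Here’s a toy C++ illustration of just that compositing step; the paper’s real cleverness lies in projecting anisotropic 3D Gaussians to 2D and doing all of this fast on a GPU, which is omitted here:

#include <algorithm>
#include <vector>

struct Splat {
  float depth;   // distance from the camera along the view ray
  float alpha;   // opacity of this splat at the pixel, in [0, 1]
  float r, g, b; // splat color
};

// Composite the splats covering one pixel, nearest first. Each splat
// contributes its color weighted by its alpha and by the transmittance T
// left over after everything in front of it: C = sum(T_i * a_i * c_i).
void compositePixel(std::vector<Splat> splats, float out[3]) {
  std::sort(splats.begin(), splats.end(),
            [](const Splat& a, const Splat& b) { return a.depth < b.depth; });
  float T = 1.0f; // transmittance: how much light still gets through
  out[0] = out[1] = out[2] = 0.0f;
  for (const Splat& s : splats) {
    float w = T * s.alpha;
    out[0] += w * s.r;
    out[1] += w * s.g;
    out[2] += w * s.b;
    T *= (1.0f - s.alpha);
    if (T < 1e-4f) break; // early exit once the pixel is effectively opaque
  }
}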

A short video summary is embedded just below the page break. Interested in deeper details? The research PDF is here. The amount of development this field has seen is nothing short of staggering, and certainly higher in quality than what was state-of-the-art for NeRFs only a year ago.

Continue reading “High Quality 3D Scene Generation From 2D Source, In Realtime”

Is A Pigeon Faster Than The Internet?

[Jeff Geerling]’s latest project is for the birds — literally. Even though he has a brand new high-speed fiber optic internet connection, online backups of YouTube video projects still take hours. He decided to see if the conclusion of a 2009 study in South Africa still holds true today: that sending files by carrier pigeon can be faster than the internet. [Jeff] sets up an experiment to send 3 TB of data by homing pigeon a distance of one mile to establish a baseline. Next, [Jeff] sends the same 3 TB of data over the internet and, donning the cap of honorary pigeon, simultaneously embarks on a journey by air to his off-site backup service in Nova Scotia, Canada.
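
The back-of-the-envelope arithmetic shows why the bird has a fighting chance. Even an ideal, fully saturated gigabit link (an assumption for illustration, not [Jeff]’s measured upload rate) needs the better part of seven hours to move 3 TB:

#include <cstdio>

int main() {
  const double payloadBytes = 3e12;  // 3 TB of video projects
  const double linkBitsPerSec = 1e9; // hypothetical, fully saturated 1 Gbps
  double seconds = payloadBytes * 8.0 / linkBitsPerSec;  // 24,000 s
  printf("%.1f hours\n", seconds / 3600.0);              // prints "6.7 hours"
  return 0;
}

Real-world upload speeds to a backup service usually sit well below line rate, which tilts the odds even further toward the pigeon.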

Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.

[Jeff] points out that you also have to consider the transfer time of your files onto and from the pigeon-suitable memory cards. He jumped through several hoops to minimize that, but it still consumed 2-1/2 hours in total. Trying to keep the comparison fair, he also spent a couple of days optimizing his internet connection to eke out the best possible speed.

Continue reading “Is A Pigeon Faster Than The Internet?”

Solar Powered Flower Chases The Light

Many plants are capable of tracking the sun in order to get the most possible light. [hannu_hell] built a solar powered sculpture that replicates this light sensitivity for the benefit of better charging its own batteries, in theory allowing it to run indefinitely wherever suitable light is available.

The 3D-printed flower features six movable petals mounted on an articulated stem. The flower’s leaves bear solar panels that collect energy, just as the leaves of a real plant do. At the heart of the show is a Raspberry Pi Pico, outfitted with a DS1307 real-time clock and an ST7735 TFT display for showing date and time information. The Pico is also responsible for controlling the servos that aim the flower’s solar panels towards the brightest light source available, reading several photoresistors to determine light levels and adjusting the leaves accordingly.
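
The light-seeking logic boils down to comparing photoresistor readings and stepping a servo toward the brighter side. Here’s a minimal Arduino-style sketch of the idea for the Pico; the pins, sensor count, and deadband are all assumptions, and [hannu_hell]’s actual code will differ:

#include <Servo.h>

// Illustrative wiring: two photoresistor dividers flanking a panel, wired so
// a brighter sensor reads higher, and one servo panning the flower head.
// The real build uses more sensors and more servos.
const int LDR_LEFT = A0;
const int LDR_RIGHT = A1;
const int PAN_PIN = 15;
const int DEADBAND = 40; // ignore small differences to avoid jitter

Servo pan;
int angle = 90;

void setup() {
  pan.attach(PAN_PIN);
  pan.write(angle);
}

void loop() {
  int left = analogRead(LDR_LEFT);
  int right = analogRead(LDR_RIGHT);
  int diff = left - right;
  if (abs(diff) > DEADBAND) {
    // Step toward the brighter sensor, one degree at a time.
    angle += (diff > 0) ? 1 : -1;
    angle = constrain(angle, 0, 180);
    pan.write(angle);
  }
  delay(20);
}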

It’s a fun build, and one that could teach useful lessons relevant to even large-scale solar arrays. Video after the break.

Continue reading “Solar Powered Flower Chases The Light”

Will An 8088 Run DOOM? Now, Yes It Will!

The question on everyone’s lips when a new piece of hardware comes out is this: will it run DOOM? Many pieces of modern hardware have been coaxed into playing id Software’s 1993 classic, but there have always been some older machines that just didn’t have the power to do it. One of them has now been conquered though, and it’s a doozy. [Frenkel]’s Doom8088, as its name suggests, is a port of the game for the 8088-based original PC and XT.

As can be seen in this gameplay video, it’s not always the slickest of gaming experiences. But it works, so the question is: how on earth can a machine so far below the game’s original minimum spec run it at all? The answer is that it’s a port of GBADoom, the DOOM port for the Game Boy Advance, a platform with even less memory than a DOS PC. It still relies on extensive hard disk access for every frame though, which leaves it snail-like.

We set out to install it ourselves on one of the web-based PC emulators, but fell over on the size of the required Watcom installation. If any of you have the real thing lying around though, we’d love to hear in the comments how the game performs.

We’ve shown you so many ports of DOOM over the years that we’ve lost count. One of our favourite recent ones uses an extremely unconventional but very retro display.

To Give Is Better Than To Receive

Better to give a talk at a hacker event, that is. Or in your hackerspace, or even just to a bunch of fellow nerds whenever you can. When you give the talk, don’t be afraid to make it too “easy” to understand. Making a tough topic comprehensible is often the sign that you really understand it, after all, and it’s also a fantastic service to the audience. And also don’t be afraid that your talk isn’t “hard core” enough, because with a diverse enough crowd, there will absolutely be folks for whom it’s still entirely new, and they’ll be thankful.

These were the conclusions I drew from talking to a whole range of people at Chaos Communication Camp the weekend before last, and conversations like these are one of the great opportunities when you go to an event like this. At Camp there were a number of simultaneous stages, with so many talks that recordings of them are still being released. That meant that everyone had their chance to say their bit, and many, many did.

And that’s great. Because it’s obvious that getting the work done, or diving deep into a particular topic, is part of the hacker experience, but it’s equally important to share what you’ve gained with the rest of the community. The principle of spreading the knowledge is a cornerstone of our culture, and getting people up to talk about what they’ve learned is the manifestation of this cultural value. If you know something, say something!

Of course, when you’re not at a conference, you could be writing up your hacks and sending them in to the tips line (hint, hint!). That’ll work too.

Teaching A Mini-Tesla To Steer Itself

At the risk of stating the obvious, even when you’ve got unlimited resources and access to the best engineering minds, self-driving cars are hard. Building a multi-ton guided missile that can handle the chaotic environment of rush-hour traffic without killing someone is a challenge, to say the least. So if you’re looking to get into the autonomous car game, perhaps it’s best to start small.

If [Austin Blake]’s fun-sized Tesla go-kart looks familiar, it’s probably because we covered the Teskart back when he whipped up this little demon of an EV from a Radio Flyer toy. Adding self-driving to the kart is a natural next step, so [Austin] set off on a journey into machine learning to make it happen. Having settled on behavioral cloning, which trains a model to replicate a behavior by showing it examples of that behavior, he built a bolt-on frame to hold a steering servo made from an electric wheelchair motor, some drive electronics, and a webcam attached to a laptop. Ten or so human-piloted laps around a walking path at a park yielded a training set of some 48,000 images, each paired with the steering wheel angle at the moment it was captured.
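
At drive time, behavioral cloning reduces to a very simple loop: grab a camera frame, ask the trained model what steering angle a human would have produced, and command the servo to match. This skeleton shows the shape of that loop; every function in it is a hypothetical stand-in, since the real pipeline runs on [Austin]’s laptop with its own camera and model stack:

#include <cstdio>

// Hypothetical stand-ins for the real pipeline, labeled as such: the
// actual build uses webcams and a laptop-hosted trained model.
struct Frame { /* pixel data would live here */ };

Frame captureFrame() { return Frame{}; }  // grab a webcam image
float predictSteering(const Frame&) {     // trained model inference
  return 0.0f;                            // steering angle in degrees
}
void setSteeringServo(float angleDeg) {   // drive the wheelchair-motor servo
  printf("steering to %.1f deg\n", angleDeg);
}

int main() {
  // Drive-time loop: the model maps what the camera sees to the steering
  // angle a human driver produced in the recorded training laps.
  for (int tick = 0; tick < 10; ++tick) {
    Frame f = captureFrame();
    float angle = predictSteering(f);
    setSteeringServo(angle);
  }
  return 0;
}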

The first go-around wasn’t so great, with the Teskart seemingly bent on going off the track. [Austin] retooled by adding two more webcams to capture a little parallax information and hopefully enrich the training data. After a bug fix, the improved model really seemed to do the trick, with the Teskart pretty much keeping to its lane around the track no matter how fast [Austin] pushed it. Check out the video below to see the Teskart in action.

It’s important to note that this isn’t even close to “Full Self-Driving.” The only thing being controlled is the steering angle; [Austin] is controlling the throttle himself and generally acting as the safety driver should the car veer off course, which it tends to do at one particular junction. But it’s a great first step, and we’re looking forward to further development.

Continue reading “Teaching A Mini-Tesla To Steer Itself”