What do you get if you strap a microscope onto a CNC and throw in a gaming controller? The answer, according to Reddit user [AskewedBox], is something kind of awesome: you get a microscope that can be controlled with a game controller for easier tracking of tiny creepy-crawlies.
[AskewedBox] set up this interesting combination of devices, attaching their Andonstar AD246S microscope to the stage of a no-brand 1610 CNC bought off Amazon, then connecting the CNC to a computer running Universal G-Code Sender. This great open-source program takes input from an Xbox game controller and uses it to jog the CNC.
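Under the hood, jogging like this means translating stick deflection into GRBL's real-time `$J=` jog commands. Here's a minimal Python sketch of the idea (an illustration, not UGS's actual code; the step size and feed rate are made-up numbers):

```python
def jog_command(dx_mm, dy_mm, feed_mm_min):
    """Build a GRBL '$J=' incremental jog command
    (G91 = relative move, G21 = millimeters)."""
    return f"$J=G91 G21 X{dx_mm:.3f} Y{dy_mm:.3f} F{feed_mm_min:.0f}"

def stick_to_jog(stick_x, stick_y, step_mm=0.5, max_feed=1000):
    """Map a gamepad stick position (-1.0..1.0 per axis) to a jog.
    Deflection scales the feed rate, so small nudges move slowly."""
    deflection = max(abs(stick_x), abs(stick_y))
    if deflection < 0.1:  # dead zone: ignore stick noise near center
        return None
    feed = max_feed * deflection
    return jog_command(step_mm * stick_x, step_mm * stick_y, feed)
```

Polling the stick and streaming these commands over serial is then enough to get smooth, proportional motion.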
With a bit of tweaking, the game controller can now move the microscope, so it can be used to track microbes and other small creatures as they wander around the slide mounted below, eating each other. The movement is surprisingly smooth: between the small CNC and a well-mounted microscope, there seems to be very little wobble or backlash as the microscope moves.
[AskewedBox] hasn’t finished yet, though: in the latest update, he adds a polarizing lens to the setup and mentions that he wants to add focus control to the system; focus is currently handled by a remote that comes with the microscope.
There are plenty of other things that could be added beyond that, though, such as auto pan and stitch for larger photos, automatic focus stacking, and perhaps even auto tracking using OpenCV to follow the hideous tiny creatures that live in the microscopic realm. What would you do to make this even cooler?
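For a flavor of what auto tracking might look like, here's a minimal, dependency-free Python sketch: threshold a grayscale frame, find the intensity-weighted centroid of the bright pixels (the job `cv2.moments` would do in a real OpenCV pipeline), and compute the stage correction needed to recenter the critter. All names and values here are hypothetical:

```python
def track_centroid(frame, threshold=128):
    """Intensity-weighted centroid of bright pixels in a grayscale
    frame (a list of rows). Returns (x, y) or None if nothing is lit."""
    total = sx = sy = 0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += v * x
                sy += v * y
    if total == 0:
        return None
    return (sx / total, sy / total)

def correction(centroid, width, height):
    """(dx, dy) move, in pixels, that would recenter the tracked
    creature in the field of view."""
    cx, cy = centroid
    return (width / 2 - cx, height / 2 - cy)
```

Converting that pixel offset to millimeters of stage travel is then just a calibration factor, and the jog commands already exist.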
Robotic safety standards are designed for commercial bots, but amateur robot builders should also consider ideas like the keepout zone, where a mobile robot isn’t permitted to go, or how to draw out the safety perimeter for your experimental robot arm. After all, that robot arm won’t decline to crush your fingers just because you built it yourself. So it is worth looking at the standards for industrial robots, even if your aim is fun rather than profit.
The basics for fixed robots like robot arms are defined in the standard R15.06. You don’t need to read the full text (because it costs $325 and is *incredibly* tedious to read), but the Association for Advancing Automation has a good background on the details. The bottom line is to ensure that a user can’t reach into any area the robot arm might move through, and to provide a quick and easy way to disable the motors if someone does reach in.
Robots that move, called Industrial Mobile Robots (IMRs) or Autonomous Mobile Robots (AMRs), bring in a whole new set of problems, though, because they are designed to move around under their own control and often share space with humans. For them, the standard is R15.08. The AGV Network has a good guide to the details, but again, it boils down to two things: make sure the robot keeps an eye on its surroundings, and that it can stop quickly enough to avoid injury.
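The "stop quickly enough" requirement is mostly kinematics: the robot keeps moving at full speed while its sensors and controller react, then it decelerates. A back-of-the-envelope sketch in Python (the numbers are illustrative, not values taken from R15.08):

```python
def min_stopping_distance(speed_m_s, decel_m_s2, latency_s):
    """Worst-case stopping distance for a mobile robot: distance
    covered during sensing/control latency, plus braking distance
    (v^2 / 2a) once the brakes actually engage."""
    return speed_m_s * latency_s + speed_m_s ** 2 / (2 * decel_m_s2)
```

The takeaway is that doubling speed more than doubles stopping distance, so the robot's detection range has to grow faster than its top speed.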
We’ve seen plenty of first-person view (FPV) robots built using the Raspberry Pi Zero, but this one from [Shane] has an interesting twist: rather than driving the wheels directly from big motors, it uses small motors and gearboxes, with some of the gears being 3D printed.
[Shane] has posted the full details of this cute little robot, complete with 3D models, code, and plans for the PCB that connects the Zero to the motors. The motors are N20 types, which are much smaller, cheaper, and faster-spinning than what we usually see used in these projects. They also often come with a gearbox that reduces the speed to something a bit more useful. Each motor drives the two wheels on one side through a 3D-printed gear for tank-style steering.
To run the whole thing off a single LiPo battery, [Shane] also designed his own Pi HAT that converts the voltage to 5 V and adds a couple of H-bridge chips for the motors. It is a cute little build, but the requirement for a custom Pi HAT perhaps puts it beyond most beginners, who might be interested in a cheap, straightforward build like this. Does anybody have any alternatives?
Continue reading “Pi Zero FPV Robot Uses Tiny Motor & Gears”
MIT student [Anhad Sawhney] built an interesting decoration for his dorm room corridor called The Eyes of the Basilisk. Named after the mythical creature with a deadly gaze, the project monitors passers-by using thermal cameras and an LED matrix.
The project uses a thermal camera and a 64 by 64 LED panel, with an ESP32 taking the signal from the thermal camera and processing it to find the largest hot blob in the image, which is (probably) a person. The ESP32 then displays a pixel-art basilisk eye with the iris nearest the blob’s coordinates, updating once a second. With a bit of processing to make the eye appear more spherical, it is a pretty convincing trick.
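The "largest hot blob" step is a classic connected-components pass over the thermal frame. Here's a pure-Python sketch of the idea (the real firmware runs on an ESP32, and [Anhad]'s actual implementation may well differ):

```python
def largest_hot_blob(frame, threshold):
    """Flood-fill each 4-connected region of above-threshold pixels in
    a thermal frame (a list of rows of temperatures) and return the
    centroid of the largest one -- the 'biggest warm thing is probably
    a person' heuristic."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # flood fill one blob starting from this hot pixel
                stack, blob = [(x, y)], []
                seen[y][x] = True
                while stack:
                    px, py = stack.pop()
                    blob.append((px, py))
                    for nx, ny in ((px + 1, py), (px - 1, py),
                                   (px, py + 1), (px, py - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                if len(blob) > len(best):
                    best = blob
    if not best:
        return None
    return (sum(p[0] for p in best) / len(best),
            sum(p[1] for p in best) / len(best))
```

The blob's centroid then picks which pre-drawn iris position to show, so the per-frame work stays trivial for the microcontroller.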
Most of us might have built one (or two) of the devices on a breadboard and left it at that, but [Anhad] decided to use the project as a way to teach PCB fabrication to some friends, so they created a PCB that could be mounted onto the back of the LED matrix and built 14 of them using the pick & place machine he had access to at the MIT Media Lab. They then mounted all of them on the wall of his dorm corridor so the wall appeared to keep track of anyone walking by. I’ve never met a basilisk, so I don’t know how many eyes they have, but it has a pretty creepy look as it watches you walking down the corridor.
Continue reading “The Eyes Of The Basilisk Are Watching You”
Hit something with a hammer, and it makes a sound. If you’re lucky, it might even make a pleasant sound, which is the idea behind the Great Stalacpipe Organ in Luray Caverns, Virginia. The organ was created in 1954 by [Leland W. Sprinkle], who noticed that some stalactites (the ones that come down from the ceiling of the cave) would make a nice, pure tone when hit.
So, he did what any self-respecting hacker would do: he picked and carved 37 of them to form a scale and connected them to an electronic keyboard. The resonating stalactites are spread around a 3.5-acre (14,000-square-meter) chamber, but because it is in a cave, the sound can be heard anywhere within the cave system, which covers about 64 acres (260,000 square meters). That makes it the largest musical instrument in the world.
We’ll save the pedants the trouble and point out that the name is technically an error — this is not a pipe organ, which relies on air driven into resonant chambers. Instead, it is a lithophone, a percussion instrument that uses rock as the resonator. Below, you can see one of the solenoids that strike the rock to make the sound.
This is also the sort of environment that gives engineers nightmares: a constant drip-drip-drip of water filled with minerals that love to get left behind when the water evaporates. Fortunately, the Stalacpipe Organ seems to be in good hands: according to an NPR news story about it, the instrument is maintained by the caverns’ lead engineer [Larry Moyer] and his two apprentices, [Stephanie Beahm] and [Ben Caton], who are learning the details of maintaining a complex device like this.
Continue reading “Virginia Cave Is The Largest Musical Instrument In The World”
Most of us will at some point have bought a long power cable to charge the bike on the deck, but [Slava G. Turyshev] has a slightly more ambitious idea. In this recent paper, he outlines how an advanced civilization could use a star or two to transmit power or send signals over interstellar distances. And his idea is simple enough that we could do it right now with existing technology, or detect whether someone else is doing it.
Continue reading “Using Gravitational Lensing To Transmit Power And Detect Aliens”
This wall clock built by [Alf Müller] is lovely, using two NeoPixel rings to mark the time by casting light onto a 3D-printed ring. The blue shows the minutes, made more discrete by a grid inside the ring. The green shows the hours. [Alf] has provided the code so you can rework the color scheme. It might be interesting to add seconds with the red LEDs, or perhaps a countdown triggered by a touch sensor…
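The mapping from wall-clock time to LED positions on the two rings is simple modular arithmetic. A Python sketch, assuming a 60-LED minute ring and a 24-LED hour ring (the ring sizes are assumptions; [Alf]'s code defines the real ones):

```python
def clock_leds(hour, minute, minute_ring=60, hour_ring=24):
    """Map a time to LED indices on two NeoPixel rings. The hour LED
    creeps forward as minutes pass, like an analog hour hand."""
    minute_led = minute * minute_ring // 60
    hour_led = ((hour % 12) * hour_ring // 12
                + minute * hour_ring // (12 * 60))
    return minute_led, hour_led % hour_ring
```

A seconds hand on the red channel would be one more line of the same arithmetic.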
Continue reading “3D-Printed LED Wall Clock Does Lots With Little”