Taking (Good) Pictures Of PCBs

Snapping pictures is not technically difficult with modern technology, but taking good photographs is another matter. There are a number of things a photographer needs to account for to get the best possible results, and if the subject matter isn’t particularly photogenic to start with, the task gets just a little more difficult. As anyone who’s posted something for sale online can attest, taking pictures of everyday objects can present its own challenges even to seasoned photographers. [Martijn Braam] has a few tricks up his sleeve for pictures like this in his efforts to photograph various circuit boards.

[Martijn] has been updating the images on Hackerboards, an online image reference for single-board computers and other PCBs, and he demands quality in his uploads. To get good pictures of the PCBs, he starts with ample lighting in the form of two wirelessly-controlled flashes in softboxes. He’s also using a high quality macro lens with low distortion, but the real work goes into making sure the image is sharp and the PCBs have well-defined edges. He’s using a Python script to take two pictures with his camera, and some automation in ImageMagick to composite the two images together.
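The write-up doesn’t reproduce the exact commands, but a two-shot cut-out workflow of this general shape can be sketched in a few lines of Python. This is a minimal illustration, assuming a gphoto2-tethered camera and ImageMagick 7 on the path; the file names, the manual lighting swap, and the 50% threshold are all stand-ins rather than [Martijn]’s actual settings:

```python
import subprocess

def capture(filename):
    # Trigger the tethered camera and download the frame (gphoto2 CLI).
    subprocess.run(
        ["gphoto2", "--capture-image-and-download", "--filename", filename],
        check=True,
    )

# Shot 1: board lit by the softboxes. Shot 2: backlight only, so the
# board appears as a dark silhouette on a bright background.
capture("board_lit.jpg")
input("Switch to backlight only, then press Enter...")  # the flashes could
capture("board_silhouette.jpg")                         # be toggled in software

# Turn the silhouette into a black-and-white mask, then copy it into the
# alpha channel of the lit shot for a crisp cut-out of the PCB.
subprocess.run(
    ["magick", "board_silhouette.jpg", "-colorspace", "Gray",
     "-threshold", "50%", "-negate", "mask.png"],
    check=True,
)
subprocess.run(
    ["magick", "board_lit.jpg", "mask.png",
     "-alpha", "off", "-compose", "CopyOpacity", "-composite",
     "board_cutout.png"],
    check=True,
)
```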

While we’re not all taking pictures of PCBs, it’s a great demonstration of how a workflow can be automated in surprising ways, not to mention how to properly light a photography subject. There are some other excellent ways of lighting subjects that we’ve seen, too, including using broken LCD monitors, or you can take some of these principles to your workspace with this arch lighting system.

3D Printer Repurposed For Light-Duty Lab Automation Tasks

Laboratory automation equipment is expensive stuff, to such a degree that small labs are often priced out of the market. That’s a shame, because there are a lot of tedious manual tasks that even modest labs would benefit from automating. Oh well — that’s what grad students are for.

But it actually isn’t that hard to bring a little automation to the lab, if you follow the lead of [Marco], [Chinna], and [Vittorio] and turn a 3D printer into a simple lab robot. That’s what HistoEnder is — a bog-standard Creality Ender 3 with a couple of special modifications that turn it into a tool for automating histology slide preparation. Histology is the study of the anatomy of tissues and uses various fixing and staining techniques to make microscopic features visible. In practice, this means moving baskets of glass slides back and forth between jars of different solutions, a job that’s perfect for a simple Cartesian gantry lab robot with a small work envelope and light loads.

None of the printer modifications are permanent; the 3D printed accessories — a hook for the slide basket and a carrier for standard histology staining jars — can quickly come off the printer to return it to its regular duty. All it takes to run HistoEnder is a bit of custom G-code and some careful alignment of the jar carrier on the print bed. We suppose the bed heater could even be used to warm up the fixing and staining solutions. There’s a brief video of HistoEnder in action embedded in the tweet below.
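The project’s G-code isn’t reproduced here, but the idea is easy to sketch: each jar sits at a known X position on the bed, and a protocol is just “lift, travel, lower, soak” repeated per step. Here’s a hypothetical Python generator for Marlin-flavored G-code; the coordinates, feed rates, and soak times are invented for illustration:

```python
# Hypothetical G-code generator for a HistoEnder-style staining run.
# Jar X positions (mm) and soak times (s) are invented, not measured.
JARS = {"fixative": 40, "stain": 90, "rinse": 140}
Z_UP, Z_DOWN = 60, 20  # basket clear of the jars / fully submerged (mm)
PROTOCOL = [("fixative", 120), ("stain", 300), ("rinse", 60)]

def staining_gcode(protocol):
    lines = ["G28 X Z", "G90"]        # home the axes, absolute positioning
    for jar, soak_s in protocol:
        lines += [
            f"G0 Z{Z_UP} F600",       # lift the basket clear of the jars
            f"G0 X{JARS[jar]} F3000", # travel to the next jar
            f"G0 Z{Z_DOWN} F600",     # lower the basket into the solution
            f"G4 S{soak_s}",          # dwell for the soak time
        ]
    lines.append(f"G0 Z{Z_UP} F600")  # finish with the basket raised
    return "\n".join(lines)

print(staining_gcode(PROTOCOL))
```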

This isn’t the first time this team has repurposed technology for the lab — remember the fitness band that was turned into an optical densitometer?

Continue reading “3D Printer Repurposed For Light-Duty Lab Automation Tasks”

Gesture Controlled Filming Gear Works Super Intuitively

Shooting good video can be an arduous task if you’re working all by yourself. [Pave Workshop] developed a series of gesture-responsive tools to help out, with a focus on creating a simple intuitive interface.

The system is based around using a Kinect V2 to perceive gestures made by the user, which can then control various objects in the scene. For instance, a beckoning motion can instruct a camera slider to dolly forward or backward, and a halting gesture will tell it to stop. Bringing the two hands together or apart in special gestures indicates that the camera should zoom in or out. Lights can also be controlled by pulling a fist towards or away from them to change their brightness.
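The project itself runs on Node.js (more on that below), but the underlying pattern, turning continuously tracked joints into discrete commands, is straightforward to sketch. Here’s a minimal Python illustration of the two-handed zoom gesture; the threshold value and the direction mapping are guesses, not figures from the project:

```python
import math

ZOOM_THRESHOLD = 0.05  # change in hand separation (m) per update; a guess

class ZoomGesture:
    """Emit zoom commands when the tracked hands move together or apart."""

    def __init__(self):
        self.last_distance = None

    def update(self, left_hand, right_hand):
        # left_hand/right_hand are (x, y, z) joints from the skeleton tracker.
        distance = math.dist(left_hand, right_hand)
        if self.last_distance is None:
            self.last_distance = distance
            return None
        delta = distance - self.last_distance
        self.last_distance = distance
        if delta > ZOOM_THRESHOLD:
            return "zoom_in"   # hands spreading apart (mapping assumed)
        if delta < -ZOOM_THRESHOLD:
            return "zoom_out"  # hands coming together (mapping assumed)
        return None

# Each Kinect frame would feed the tracker something like this:
gesture = ZoomGesture()
print(gesture.update((0.3, 1.2, 2.0), (0.5, 1.2, 2.0)))  # first frame: None
print(gesture.update((0.2, 1.2, 2.0), (0.6, 1.2, 2.0)))  # "zoom_in"
```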

The devil is in the details with a project that works this smoothly. [Pave Workshop] lays out how Node.js was used to knit everything together, from the custom camera slider to Philips Hue bulbs and various Arduino-based components.

The project looks really impressive in the demo video on YouTube. We’ve seen some other impressive automated filming rigs before, too.

Continue reading “Gesture Controlled Filming Gear Works Super Intuitively”

Robots Are Folding Laundry, But They Suck At It

Robots are used in all sorts of industries on a wide variety of tasks. Typically, it’s because they’re far faster, more accurate, and more capable than we are. Expert humans could not compete with the consistent, speedy output of a robotic welder on an automotive production line, nor could they coat the chocolate on the back of a KitKat quite so delicately.

However, there are some tasks in which humans still have the edge. Those include driving, witty repartee, and yes, folding laundry. That’s not to say the robots aren’t trying, though, so let’s take a look at the state of the art.

Continue reading “Robots Are Folding Laundry, But They Suck At It”

Retrofitting Robots

Al Williams wrote up a neat thought piece on why we are so fascinated with robots that come in the shape of people, rather than robots shaped like whatever it is they’re supposed to be doing. Al is convinced that sci-fi is at least partly responsible, and that it shapes people’s expectations of what robots look like.

What sparked the whole thought train was the ROAR (robot-on-a-rail) style robot arms that have been popping up, at least in the press, as robot fry cooks. As the name suggests, it’s a robot arm on a rail that moves back and forth across a frying surface and uses CV algorithms to sense and flip burgers. Yes, a burger-flipping robot arm. Al asks why they didn’t just design the flipper into the stovetop, like you would expect with any other assembly line.

In this particular case, I think it’s a matter of economics. The burger chains already have an environment that’s designed around human operators flipping the burgers. A robot arm on a rail is simply the cheapest way of automating the task that fits in with the current ergonomics. The robot arm works like a human arm because it has to work in an environment designed for the human arm.

Could you redesign a new automatic burger-flipping system to be more space efficient or more reliable? Probably. If you did, would you end up with a humanoid arm? Not necessarily. But this is about patching robotics into a non-robotic flow, and that means they’re going to have to play by our rules. I’m not going to deny the cool factor of having a robot arm flip burgers, but my guess is that it’s actually the path of least resistance.

It feels kind of strange to think of a sci-fi timeline where the human-looking robots come first, and eventually get replaced by purpose-built intelligent machines that look nothing like us as their environments get complete overhauls, but that may be the way it’s going to play out. Life doesn’t always imitate art.


Sleep Easy With The Fishes Well Fed

Sometimes daily tasks, like feeding pets, can feel like a real chore. To help alleviate the mundane aspects of daily life, [Erik Berglund] has created an automatic fish feeder, complete with 3D print files, firmware, and an Android app for complete control over scheduling and feeding.

The mechanics of the fish feeder center on a screw conveyor that pushes food pellets out of a storage basin. The conveyor is driven by a Feetech FS5106R servo, which provides enough torque to clear the jams that occur when pellets get stuck in the mechanism. [Erik Berglund] writes that the system can dispense about 0.9 g/s and that it’s designed for granulated food, as flakes have problems because “their low density and large surface area tend to get them stuck in the throat of the hopper” — an issue that we’ve looked into previously.

[Erik Berglund] used [coberdas]’s fish feeder as the base, upgrading it with a better servo, adding a Raspberry Pi Zero W along with software for the Pi and an Android application to control the schedule of feedings. There’s also a DS1307 real time clock module to keep precision time and a push button for “manual” feeding. If you’re looking to follow along at home, you can find the Python scripts that run on the Pi and the source code for the Android application in their respective GitHub repositories.
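The actual control scripts live in the repositories linked above, but the dispensing math is simple enough to sketch: run a continuous-rotation servo like the FS5106R just long enough to meter out the requested mass at the quoted 0.9 g/s. In this minimal Python sketch the GPIO pin and duty cycles are assumptions, not values from [Erik Berglund]’s code:

```python
import time
import RPi.GPIO as GPIO

SERVO_PIN = 18       # assumed BCM pin carrying the servo signal
RATE_G_PER_S = 0.9   # dispensing rate quoted in the write-up

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)  # standard 50 Hz hobby-servo signal

def dispense(grams):
    """Spin the screw conveyor long enough to meter out `grams` of pellets."""
    pwm.start(10.0)           # ~2 ms pulse: full speed (duty cycle assumed)
    time.sleep(grams / RATE_G_PER_S)
    pwm.ChangeDutyCycle(7.5)  # ~1.5 ms pulse: neutral stops a continuous servo
    time.sleep(0.5)
    pwm.stop()

dispense(1.8)  # roughly two seconds of feeding
GPIO.cleanup()
```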

Continue reading “Sleep Easy With The Fishes Well Fed”

Ceiling Fan Adds CO2 Sensor

Ceiling fans seem to be an oft-misunderstood or overlooked household appliance. As such, they seem to have missed a lot of the IoT wave. Sure, you can get smart controllers for them to plug into your home automation system of choice, but these mostly rely on temperature sensors, simple timers, or voice commands. There’s a lot more to a ceiling fan than maintaining a comfortable temperature, as [EJ] demonstrates with this smarter ceiling fan build.

A big part of the job of a ceiling fan is to improve air circulation, which can help keep a room from feeling “stuffy”. This feeling is usually caused by excess CO2 as a result of respiration in an area where the air is not moving enough to exhaust this gas. Not only does [EJ]’s controller make use of a temperature monitor for controlling the fan automatically, but there is also a CO2 sensor integrated to improve this aspect of air quality when needed.

The entire build is based on a Raspberry Pi Zero, and nothing needed to be changed about the ceiling fan itself, because it already included a radio-based remote control. By monitoring the signals produced by the remote, the Raspberry Pi was programmed to mimic those commands whenever the surrounding sensors captured a condition where [EJ] would want the fan on. There’s a manual control button as well, so the fan control is not entirely in the hands of the computer.
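The capture details are on the project page linked below, but replaying a simple on-off-keyed remote generally boils down to toggling a transmitter pin with the recorded pulse timings. Here’s a bare-bones Python sketch along those lines; the GPIO pin and the pulse train are placeholders, not [EJ]’s captured waveform:

```python
import time
import RPi.GPIO as GPIO

TX_PIN = 17  # assumed BCM pin wired to a simple 433 MHz OOK transmitter

# (level, seconds) pairs captured from the original remote; these timings
# are invented placeholders, not the real waveform.
FAN_ON = [(1, 0.00035), (0, 0.00105), (1, 0.00105), (0, 0.00035)] * 8

GPIO.setmode(GPIO.BCM)
GPIO.setup(TX_PIN, GPIO.OUT)

def send(pulses, repeats=5):
    """Replay a recorded pulse train; remotes usually repeat each frame."""
    for _ in range(repeats):
        for level, seconds in pulses:
            GPIO.output(TX_PIN, level)
            time.sleep(seconds)  # sleep() timing is coarse; fine for a sketch
        GPIO.output(TX_PIN, 0)
        time.sleep(0.01)         # inter-frame gap

# A controller like [EJ]’s would call this when, say, CO2 climbs past a
# threshold (ideally with some hysteresis so the fan doesn’t rapid-cycle).
send(FAN_ON)
GPIO.cleanup()
```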

For a little more detailed information about this build, there’s a separate project page which details a lot of the information about the RF waveform capturing and recreation. And, if you want to take your fan to the next level, take a look at this one which focuses on building a smartphone app to control the fan instead.