Cyanotype Prints On A Resin 3D Printer

Not that it’s the kind of thing that pops into your head often, but if you ever do think of a cyanotype print, it probably doesn’t conjure up thoughts of modern technology. For good reason — the monochromatic technique was introduced in the 1840s, and was always something of a niche technology compared to more traditional photographic methods.

The original method is simple enough: put an object or negative between the sun and a UV-sensitive medium, and the exposed areas will turn blue and produce a print. This modernized concept created by [Gabe] works the same way, except both the sun and the negative have been replaced by a lightly modified resin 3D printer.

A good chunk of the effort here is in the software, as [Gabe] had to write some code that would take an image and turn it into something the printer would understand. His proof of concept was a clever bit of Python code that produced an OpenSCAD script, turning each pixel of the grayscale image into a rectangular column of variable height. The resulting STL files could be run through the slicer to produce the necessary files to load into the printer. This was eventually replaced with a new Python script capable of converting images to native printer files through UVtools.
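To get a feel for the approach, here's a rough sketch of what that OpenSCAD-generating step might look like. To be clear, this is our own illustration rather than [Gabe]'s actual code, and the pixel size, maximum column height, and file names are all placeholders:

```python
# A minimal sketch of the heightmap idea; not [Gabe]'s actual code.
# Each image pixel becomes a rectangular column whose height depends on its
# darkness, so darker areas end up with more layers (and more UV exposure)
# once the STL is sliced. Requires Pillow (pip install pillow).
from PIL import Image

PIXEL_SIZE = 0.2   # mm per image pixel on the build plate (assumed)
MAX_HEIGHT = 2.0   # mm column height for a fully dark pixel (assumed)

def image_to_scad(image_path, scad_path):
    img = Image.open(image_path).convert("L")  # 8-bit grayscale
    width, height = img.size
    with open(scad_path, "w") as scad:
        for y in range(height):
            for x in range(width):
                # Invert this mapping if your print comes out as a negative.
                darkness = 255 - img.getpixel((x, y))
                column_height = MAX_HEIGHT * darkness / 255
                if column_height <= 0:
                    continue  # skip pure-white pixels entirely
                scad.write(
                    f"translate([{x * PIXEL_SIZE}, {y * PIXEL_SIZE}, 0]) "
                    f"cube([{PIXEL_SIZE}, {PIXEL_SIZE}, {column_height:.3f}]);\n"
                )

if __name__ == "__main__":
    # Downscale large images first; one cube per pixel adds up quickly.
    image_to_scad("photo.png", "cyanotype.scad")
```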

On the hardware side, all [Gabe] had to do was remove the vat that would usually hold the resin, and replace that with a wooden lid to both hold the UV-sensitized paper in place and protect the user’s eyes. [Gabe] says there’s still some room for improvement, but you wouldn’t know it by looking at some of the gorgeous prints he’s produced already.

No word yet on whether or not future versions of the project will support direct-to-potato imaging.

Could Non-Planar Infill Improve The Strength Of Your 3D Prints?

When you’re spitting out G-Code for a 3D print, you can pick all kinds of infill settings. You can choose the pattern and the percentage… but the vast majority of slicers have one thing in common. They all print layer by layer, infill and all. What if there was another way?

There’s been a lot of chatter in the 3D printing world about the potential of non-planar prints. Following this theme, [TenTech] has developed a system for non-planar infill. This is where the infill design is modulated with sinusoidal waves in the Z axis, such that it forms a somewhat continuous bond between what would otherwise be totally separate layers of the print. This is intended to create a part that is stronger in the Z direction—historically a weakness of layer-by-layer FDM parts.
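To give a flavor of the idea, a post-processing script along these lines could wobble the Z coordinate of infill moves with a sine wave. This is a simplified illustration of the concept rather than [TenTech]'s actual implementation; it assumes the slicer annotates feature types with ;TYPE: comments, and a real version would also have to compensate extrusion and steer clear of top and bottom layers:

```python
# Toy illustration of non-planar infill; not [TenTech]'s actual code.
# It nudges the Z height of infill extrusion moves up and down with a sine
# wave so that infill from neighbouring layers can interlock.
import math
import re

AMPLITUDE = 0.2    # mm of Z modulation (assumed)
WAVELENGTH = 10.0  # mm period of the wave along X (assumed)

def modulate_infill(lines):
    in_infill = False
    current_z = 0.0
    out = []
    for line in lines:
        if ";TYPE:" in line:                      # feature-type comments from the slicer
            in_infill = "Internal infill" in line
        z_match = re.search(r"\bZ([-\d.]+)", line)
        if z_match and line.startswith("G1") and not in_infill:
            current_z = float(z_match.group(1))   # remember the layer's nominal Z
        x_match = re.search(r"\bX([-\d.]+)", line)
        if in_infill and line.startswith("G1") and x_match and "E" in line:
            x = float(x_match.group(1))
            new_z = current_z + AMPLITUDE * math.sin(2 * math.pi * x / WAVELENGTH)
            # Strip any existing Z word and append the modulated one.
            line = re.sub(r"\s*\bZ[-\d.]+", "", line).rstrip("\n") + f" Z{new_z:.3f}\n"
        out.append(line)
    return out

if __name__ == "__main__":
    with open("input.gcode") as f:
        modified = modulate_infill(f.readlines())
    with open("output_nonplanar.gcode", "w") as f:
        f.writelines(modified)
```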

Files are on GitHub for the curious, and currently, it only works with PrusaSlicer. Ultimately, it’s interesting work, and we can’t wait to see where it goes next. What we really need is a comprehensive and scientific test regime on the tensile strength of parts printed using this technique. We’ve featured some other neat work in this space before, too. Video after the break.

Continue reading “Could Non-Planar Infill Improve The Strength Of Your 3D Prints?”

Robot Air Hockey Player Predicts Your Next Move

Air hockey is a fun game, but it’s one you can’t play by yourself. That is, unless you have a smart robot hockey player to act as your rival. [Zeroshot] built exactly that.

The build is based around a small 27-inch air hockey table—not exactly arcade-spec, but big enough to demonstrate the concepts at play. The robot player moves its mallet in the X and Y axes using a pair of NEMA 17 stepper motors in an H-belt configuration. To analyze the game state, a Raspberry Pi 3B fitted with a camera gets a top-down view of the board. The Pi tells the stepper motors how to move the mallet via an Arduino that communicates with the stepper drivers. The Pi doesn’t just aim for the puck itself, either. With Python and OpenCV, it tries to predict your next move by tracking both your mallet and the puck. It extrapolates the puck’s very predictable path and moves into position for an effective defence.
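The vision pipeline might look something like the sketch below. This is our own simplified take rather than [Zeroshot]'s code: it assumes the puck can be picked out with a simple HSV colour threshold, ignores wall bounces, and treats the camera as a generic capture device:

```python
# Rough sketch of puck tracking and straight-line prediction with OpenCV.
# Not [Zeroshot]'s actual code; colour range, camera index, and the defensive
# line position are assumptions.
import cv2
import numpy as np

PUCK_LOWER = np.array([0, 120, 120])    # HSV range for the puck colour (assumed)
PUCK_UPPER = np.array([10, 255, 255])
DEFENCE_Y = 40                          # robot's defensive line, in pixels (assumed)

def find_puck(frame):
    """Return the puck centre in pixel coordinates, or None if not visible."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, PUCK_LOWER, PUCK_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (x, y), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return (x, y)

def predict_intercept(prev, curr):
    """Extrapolate the puck's straight-line path to the robot's defensive line."""
    if prev is None or curr is None:
        return None
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    if dy >= 0:                 # puck is not moving toward the robot's end
        return None
    t = (DEFENCE_Y - curr[1]) / dy
    return curr[0] + dx * t     # X position to move the mallet to

cap = cv2.VideoCapture(0)
prev = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    curr = find_puck(frame)
    target_x = predict_intercept(prev, curr)
    if target_x is not None:
        print(f"move mallet to x={target_x:.0f}")   # would go to the Arduino over serial
    prev = curr
```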

Believe it or not, we’ve featured quite a few projects in this vein before. They’ve all got their similarities, and their own unique quirks. Video after the break.
Continue reading “Robot Air Hockey Player Predicts Your Next Move”

Custom-built RGB laser firing a beam

Lasers, Galvos, Action: A Quest For Laser Mastery

If you’re into hacking hardware and bending light to your will, [Shoaib Mustafa]’s latest project is bound to spike your curiosity. Combining lasers to project multi-colored beams onto a screen is ambitious enough, but doing it with a galvo mirror, an STM32 microcontroller, and mostly scratch-built components? That’s next-level tinkering. This project isn’t just a feast for the eyes—it’s an adventure in control algorithms, hardware hacks, and the occasional ‘oops, that didn’t work.’ You can follow [Shoaib]’s build log and join the journey here.

The nitty-gritty is where it gets fascinating. [Shoaib] digs into STM32 Timers, explaining how modes like Timer, Counter, and PWM are leveraged for precise control. From adjusting laser intensity to syncing the galvos for projection, every component is tuned for maximum flexibility. Need the lasers aligned? Enter spectrometry and optical diffusers for precise wavelength management. Want real-time tweaks? A Python-controlled GUI handles the instruments while keeping the setup minimalist. This isn’t just a DIY build—it’s an exercise in problem-solving, with milestones like a working simulation and implemented control algorithms along the way.
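If you've never done the timer arithmetic, it boils down to picking a prescaler and auto-reload value for the PWM frequency you want, then mapping intensity onto the compare register. The little helper below is our own back-of-the-envelope sketch, not code from [Shoaib]'s project, and assumes a 72 MHz timer clock:

```python
# Helper for the kind of timer math involved; not code from [Shoaib]'s build.
# Given a timer clock, pick a prescaler and auto-reload (ARR) value for a
# target PWM frequency, then convert a 0..1 laser intensity into the compare
# (CCR) value that sets the duty cycle.
TIMER_CLOCK_HZ = 72_000_000   # typical STM32 timer clock (assumed)

def pwm_settings(target_hz, resolution=1000):
    """Return (PSC, ARR, actual_hz) giving roughly `resolution` duty steps."""
    prescaler = max(1, round(TIMER_CLOCK_HZ / (target_hz * resolution)))
    arr = round(TIMER_CLOCK_HZ / (prescaler * target_hz)) - 1
    actual_hz = TIMER_CLOCK_HZ / (prescaler * (arr + 1))
    return prescaler - 1, arr, actual_hz   # STM32 registers are zero-based

def intensity_to_ccr(intensity, arr):
    """Map a 0.0-1.0 laser intensity onto the timer's compare register."""
    return round(max(0.0, min(1.0, intensity)) * (arr + 1))

if __name__ == "__main__":
    psc, arr, actual = pwm_settings(20_000)   # e.g. 20 kHz PWM for a laser driver
    print(f"PSC={psc}, ARR={arr}, actual frequency {actual:.1f} Hz")
    print(f"50% intensity -> CCR={intensity_to_ccr(0.5, arr)}")
```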

If laser projection or STM32 wizardry excites you, this build will inspire. We featured a similar project by [Ben] back in September, and if you dig deep into our archives, you can eat your heart out on decades of laser projector projects. Explore [Shoaib]’s complete log on Hackaday.io. It is—literally—hacking at its most brilliant.

Artificial Intelligence Runs On Arduino

Fundamentally, an artificial intelligence (AI) is nothing more than a system that takes a series of inputs, makes some prediction, and then outputs that information. Of course, the types of AI in the news right now can handle a huge number of inputs and need server farms’ worth of compute to generate outputs of various forms, but at a basic level, there’s no reason a purpose-built AI can’t run on much less powerful hardware. As a demonstration, and to win a bet with a friend, [mondal3011] got an artificial intelligence up and running on an Arduino.

This AI isn’t going to do anything as complex as generate images or write clunky preambles to every recipe on the Internet, but it is still a functional and useful piece of software. This one specifically handles the brightness of a single lamp, taking user input on acceptable brightness ranges in the room and outputting what it thinks the brightness of the lamp should be to match the user’s preferences. [mondal3011] also builds a set of training data for the AI to learn from, taking the lamp to various places around the house and letting it figure out where to set the brightness on its own. The training data is run through a linear regression model in Python which generates the function that the Arduino needs to automatically operate the lamp.
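The training step can be as simple as an ordinary least-squares fit. The sketch below is our own simplified take, not [mondal3011]'s exact script; it assumes the logged data is a CSV of ambient-light and target-brightness pairs, and it spits out the prediction function as Arduino-ready C:

```python
# Simplified sketch of the training step; not [mondal3011]'s exact script.
# Assumes a CSV with two columns: ambient light reading, user-chosen brightness.
import numpy as np

data = np.loadtxt("lamp_training_data.csv", delimiter=",")  # columns: ambient, target
ambient, target = data[:, 0], data[:, 1]

# Ordinary least-squares fit: target ≈ slope * ambient + intercept
slope, intercept = np.polyfit(ambient, target, 1)

# Emit the prediction function as Arduino C so it can be pasted into the sketch.
print(f"""
int predictBrightness(int ambient) {{
  float b = {slope:.6f} * ambient + {intercept:.6f};
  return constrain((int)b, 0, 255);
}}
""")
```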

Although this isn’t the most complex model, it does go a long way to demonstrating the basic principles of using artificial intelligence to build a useful and working model, and then taking that model into the real world. Note also that the model is generated on a more powerful computer before being ported over to the microcontroller platform. But that’s all par for the course in AI and machine learning. If you’re looking to take a step up from here, we’d recommend this robot that uses neural networks to learn how to walk.


Hackaday Links: October 13, 2024

So far, food for astronauts hasn’t exactly been haute cuisine. Freeze-dried cereal cubes, squeezable tubes filled with what amounts to baby food, and meals reconstituted with water from a fuel cell don’t seem like meals to write home about. And from the sound of research into turning asteroids into astronaut food, things aren’t going to get better with space food anytime soon. The work comes from Western University in Canada and proposes that carbonaceous asteroids like the recently explored Bennu be converted into edible biomass by bacteria. The exact bugs go unmentioned, but when fed simulated asteroid bits, they’re said to produce a material similar in texture and appearance to a “caramel milkshake.” Having grown hundreds of liters of bacterial cultures in the lab, we agree that liquid cultures spun down in a centrifuge look tasty, but if the smell is any indication, the taste probably won’t live up to expectations. Still, when a 500-meter-wide chunk of asteroid can produce enough nutritionally complete food to sustain between 600 and 17,000 astronauts for a year without having to ship it up the gravity well, concessions will likely be made. We expect that this won’t apply to the nascent space tourism industry, which for the foreseeable future will probably build its customer base on deep-pocketed thrill-seekers, a group that’s not known for its ability to compromise on creature comforts.

Continue reading “Hackaday Links: October 13, 2024”

Creating Customized Diffraction Lenses For Lasers

[The Thought Emporium] has been fascinated by holograms for a long time, and in all sorts of different ways. His ultimate goal right now is to work up to creating holograms using chocolate, but along the way he’s found another interesting way to manipulate light. Using specialized diffraction gratings, a laser, and a few lines of code, he explores a unique way of projecting hologram-like images on his path to the chocolate hologram.

There’s a lot of background that [The Thought Emporium] has to go through before explaining how this project actually works. Briefly, this is a type of “transmission hologram” that doesn’t use a physical object as a model. Instead, it uses diffraction gratings: materials shaped to bend and split light in specific ways. After some discussion he demonstrates creating diffraction gratings using film. Certain diffraction patterns, even one that blocks the light source entirely, can actually act as a lens, since the light bends around the blockage toward the center of the shadow, where focal points can form. From there, a special diffraction lens can be built.

The diffraction lens can be shaped into any pattern with a small amount of computer code that computes the diffraction pattern for a given image. The pattern is then transferred to film, and when a laser is pointed at it, the image appears on the projection surface. Diffraction gratings like these have a number of other uses as well; the video also shows a specific pattern being used to focus a telescope for astrophotography, and a few others in the past have used them to create the elusive holographic chocolate that [The Thought Emporium] is working towards.
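One common way to compute such a pattern is the Gerchberg–Saxton algorithm, which bounces a light field back and forth between the hologram plane and the image plane until the phases converge. The sketch below is a generic example of that approach rather than the exact method used in the video, and the final black-and-white quantization is only a crude stand-in for a proper phase hologram:

```python
# Generic Gerchberg-Saxton sketch for computing a diffraction pattern whose
# far-field reproduces a target image. Not necessarily the method in the video.
import numpy as np
from PIL import Image

def gerchberg_saxton(target, iterations=50):
    """Return a phase-only hologram whose Fourier transform approximates `target`."""
    target = target / target.max()                                   # desired amplitude
    field = np.exp(1j * 2 * np.pi * np.random.rand(*target.shape))   # random start phase
    for _ in range(iterations):
        far = np.fft.fft2(field)
        far = target * np.exp(1j * np.angle(far))    # keep phase, enforce target amplitude
        field = np.fft.ifft2(far)
        field = np.exp(1j * np.angle(field))         # hologram plane is phase-only
    return np.angle(field)

if __name__ == "__main__":
    img = np.asarray(Image.open("target.png").convert("L"), dtype=float)
    phase = gerchberg_saxton(img)
    # Quantize to black/white so the pattern can be printed onto film as a
    # binary mask (a rough approximation of the computed phase hologram).
    binary = (phase > 0).astype(np.uint8) * 255
    Image.fromarray(binary).save("diffraction_pattern.png")
```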

Continue reading “Creating Customized Diffraction Lenses For Lasers”