3D Printering: Treating Filament Like Paint Opens Wild Possibilities

New angles and concepts in 3D printing are always welcome, and we haven’t seen anything quite like [Horn & Rhode]’s 3D prints that do not look anything like 3D prints, accomplished with an experimental tool called HueForge. The concept behind it is simple (though not easy), and the results can be striking when applied correctly.

3D prints that really don’t look 3D-printed.

The idea is this: colored, melted filament is, in a sense, not that different from colored paint. Both come in various colors, are applied in thin layers, and blend into new colors when they do so. When applied correctly, striking imagery can emerge. An example is shown here, but there are several more both on the HueForge project page as well as models on Printables.

Instead of the 3D printer producing a 3D object, the printer creates a (mostly) flat image similar in structure to a lithophane. But unlike a lithophane, these blend colors in clever and effective ways by printing extremely thin layers in highly precise ways.

Doing this effectively requires a software tool to plan the color changes and predict how the outcome will look. It all relies on the fact that even solid-color filaments are not actually completely opaque — not when printed at a layer height of 0.08 mm, anyway — and colors will, as a result, blend into one another when layered. That’s how a model like the one shown here can get away with only a few filament changes.

Of course, this process is far from being completely automated. Good results require a solid amount of manual effort, and the transmissivity of one’s particular filament choices plays a tremendous role in how colors will actually blend. That’s where the FilaScope comes in: a tool to more or less objectively measure how well (or how poorly) a given filament transmits light. The results plug into the HueForge software to better simulate results and plan filament changes.
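To get a feel for why transmissivity matters so much, here’s a toy model of layered color blending. It assumes each thin layer contributes an opacity proportional to layer height divided by the filament’s transmission distance, which is a simplification of what HueForge actually computes; the function and numbers below are illustrative, not from the tool itself.

```python
def composite_layers(base_rgb, layers, layer_height=0.08):
    """Blend stacked translucent filament layers over a base color.

    Each layer is (rgb, transmission_distance_mm). A layer that is thin
    relative to its transmission distance is mostly see-through, so the
    colors beneath it shine through. Crude linear-opacity model only.
    """
    r, g, b = base_rgb
    for (lr, lg, lb), td in layers:  # bottom-most translucent layer first
        alpha = min(1.0, layer_height / td)  # rough per-layer opacity
        r = alpha * lr + (1 - alpha) * r
        g = alpha * lg + (1 - alpha) * g
        b = alpha * lb + (1 - alpha) * b
    return (r, g, b)

# One 0.08 mm layer of a blue with a (hypothetical) 0.16 mm transmission
# distance over white: half the white still shows through.
print(composite_layers((255, 255, 255), [((0, 0, 255), 0.16)]))
# -> (127.5, 127.5, 255.0)
```

Even this crude model shows why a handful of filament swaps can yield many apparent colors: every extra layer shifts the blend rather than replacing it outright.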

When done well, it’s possible to create things that look nothing at all like what we have come to expect 3D-printed objects to look like. The cameo proof-of-concept model is available here if you’d like to try it for yourself, and there’s also an Aztec-style carving that gives a convincing illusion of depth.

[Horn & Rhode] point out that this concept is still searching for the right name. Front-lit lithophane? Reverse lithophane? Filament painting? Color-blended bas-relief? If you have a better idea, we urge you not to keep it to yourself, because [Horn & Rhode] absolutely want to hear from you.

These Illusions Celebrate Exploiting Human Senses

Illusions are perceptual experiences that do not match physical reality, and the 2023 Illusion of the Year contest produced a variety of nifty ones that are worth checking out. A video for each is embedded below the break, but we’ll briefly explain each as well.

Some of the visual illusions play with perspective. One such example happens to be the contest winner: Platform 9 3/4 has a LEGO car appear to drive directly through a wall. It happens so quickly it’s difficult to say what happened at all!

Another good one is the Tower of Cubes, which appears as two stacks of normal-looking hollow cubes, but some of the cubes are in fact truly bizarre shapes when seen from the side. This is a bit reminiscent of the ambiguous cylinder illusion by Japanese mathematician and artist [Kokichi Sugihara].

Cornelia is representative of the hollow face illusion, in which a concave face is perceived as a normal convex one. (Interestingly this illusion is used to help diagnose schizophrenia, as sufferers overwhelmingly fail to perceive the illusion.)

The Accelerando Illusion is similar to (but differs from) an auditory effect known as the Risset Rhythm by composer Jean-Claude Risset. It exploits ambiguities in sound to create a dense musical arrangement that sounds as though it is constantly increasing in tempo.
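For the curious, the core Risset-rhythm trick can be sketched in a few lines: several click tracks an octave apart in tempo all speed up across a loop while crossfading in and out, so the loop repeats seamlessly yet seems to accelerate forever. This is a generic illustration of the technique, not code from the contest entry.

```python
import numpy as np

def risset_rhythm(duration=4.0, sr=8000, base_bpm=120, voices=3):
    """Generate one loop of a Risset-rhythm style endless accelerando.

    Each voice is a click track whose tempo doubles over the loop.
    Because adjacent voices sit an octave apart in tempo, fading the
    fastest voice out while a new slow voice fades in lets the loop
    repeat while sounding like it speeds up forever.
    """
    t = np.linspace(0, duration, int(sr * duration), endpoint=False)
    phase_frac = t / duration            # 0 -> 1 across the loop
    out = np.zeros_like(t)
    for v in range(voices):
        # tempo doubles over the loop; voices are tempo octaves apart
        bpm = base_bpm * 2.0 ** (v + phase_frac)
        beat_phase = np.cumsum(bpm / 60.0) / sr   # beats elapsed
        clicks = (np.mod(beat_phase, 1.0) < 0.05).astype(float)
        # raised-cosine gain: each voice fades in, peaks, and fades out
        gain = np.sin(np.pi * (v + phase_frac) / voices) ** 2
        out += gain * clicks
    return out / voices
```

Writing the result to a WAV file and looping it makes the effect obvious: the tempo always seems to climb, yet the file ends where it began.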

The Buddha’s Ear Illusion creates the illusion of feeling as though one’s earlobe is being stretched out to an absurd length, and brings to mind the broader concept of body transfer illusion.

While it didn’t appear in the contest, we just can’t resist bringing up the Thermal Grill Illusion, in which one perceives a painful burning sensation from touching a set of alternating hot and cold elements. Even though the temperatures of the individual elements are actually quite mild, the temperature differential plays strange tricks on perception.

A video of each of the contest’s entries is embedded below, and they all explain exactly what’s going on for each one, so take a few minutes and give them a watch. Do you have a favorite illusion of your own? Share it in the comments!

Continue reading “These Illusions Celebrate Exploiting Human Senses”

Light Meets Movement With A Minimum Of Parts

We often say that hardware hacking has never been easier, thanks in large part to low-cost modular components, powerful microcontrollers, and highly capable open source tools. But we can sometimes forget that what’s “easy” for the tinkerer that reads datasheets for fun isn’t always so straightforward for everyone else. Which is why it’s so refreshing to see projects like this LED chandelier from [MakerMan].

Despite the impressive final result, there are no microcontrollers or complex electronics at work here. It’s been pieced together, skillfully we might add, from hardware that wouldn’t be out of place in a well-stocked parts bin. No 3D-printed parts or fancy laser cutter involved, and even the bits that are welded together could certainly be fastened some other way if necessary. This particular build is not a triumph of technology, but of ingenuity.

The video below is broken up roughly into two sections: the first shows how the motorized crank and pulley system was designed and tested, complete with various bits of scrap standing in for the final LED light tubes. Once the details of how it would move were nailed down, [MakerMan] switches over to producing the lights themselves, which are nothing more than frosted plastic tubes with LED strips run down the center. Add in a sufficiently powerful 12 VDC supply, and you’re pretty much done.
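If you’re sizing a supply for a build like this, a quick back-of-the-envelope calculation helps. The sketch below uses typical datasheet figures for common 12 V 5050 LED strip (roughly 14.4 W per meter at full brightness, 60 LEDs/m); these are generic assumptions, not numbers from [MakerMan]’s build.

```python
def psu_current(strip_len_m, watts_per_m=14.4, volts=12.0, headroom=1.2):
    """Estimate supply current (amps) for a run of 12 V LED strip.

    watts_per_m is a typical full-brightness datasheet figure for
    60-LED/m 5050 strip; the headroom factor covers wiring losses and
    keeps the supply from running at its limit.
    """
    return strip_len_m * watts_per_m / volts * headroom

# Five meters of strip: 5 * 14.4 W = 72 W, or 6 A at 12 V,
# so spec roughly a 7.2 A supply with 20% headroom.
print(psu_current(5.0))  # -> 7.2
```

Long strip runs also benefit from power injection at both ends so the far LEDs don’t dim and yellow from voltage drop.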

As it so happens, this isn’t the first motorized lighting fixture that [MakerMan] has put together.

Continue reading “Light Meets Movement With A Minimum Of Parts”

Take A Ride In The Bathysphere

[Tom Scott] has traveled the world to see interesting things. So when he’s impressed by a DIY project, we sit up and listen. In this case, he’s visiting the Bathysphere, a project created by a couple of passionate hobbyists in Italy. The project is housed at Explorandia, which, based on Google Translate, sounds like a pretty epic hackerspace.

The Bathysphere project itself is a simulation of a submarine. Sounds simple, but this project is anything but. There are no VR goggles involved. Budding captains who are up for the challenge find themselves inside the cockpit of a mini-submarine. The sub itself sits on a DIY motion platform, with strong electric motors moving the system so riders feel like they are truly underwater. Inside the cockpit, the detail is amazing: all sorts of switches, lights, and greebles make for a realistic experience. An electronic voice provides the ship’s status and lets the crew know of any emergencies. (Spoiler alert — there will be emergencies!)

The real gem is how this simulation operates. A Logitech webcam is mounted on an XY gantry and dipped into a small pond. Video from the camera is sent to a large monitor which serves as the sub’s window. It’s all very 1960s simulator tech, but the effect works. The subtle movements of the motion platform really make users feel like they are 20,000 leagues under the sea.
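A control loop for this sort of setup could be as simple as integrating throttle and rudder inputs into a camera position, clamped to the pond’s edges. The sketch below is purely hypothetical — names, units, and the control scheme are all illustrative, not taken from the Bathysphere’s actual code.

```python
import math

def step_gantry(pos, heading, throttle, rudder, dt, bounds=(0.0, 1.0)):
    """One hypothetical control step mapping sub controls to the
    camera's position on an XY gantry over the pond.

    pos is (x, y) in normalized gantry coordinates 0..1, heading is in
    radians; rudder turns the virtual sub, throttle moves it forward.
    """
    heading += rudder * dt
    x = pos[0] + throttle * math.cos(heading) * dt
    y = pos[1] + throttle * math.sin(heading) * dt
    lo, hi = bounds
    x = min(max(x, lo), hi)   # never drive the camera off the gantry
    y = min(max(y, lo), hi)
    return (x, y), heading
```

Run at a fixed tick rate, something like this would give the pilot smooth, sub-like motion while the physical camera just glides around in flat XY.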

Check out the video after the break for more info!

Continue reading “Take A Ride In The Bathysphere”

Supercon 2022: [Liz McFarland] Builds Golden Wings, Shows You How

Are you, by any chance, wondering about giving yourself wings? You should listen to [Liz McFarland] sharing her experience building a Wonder Woman suit, and not just any – the Golden Eagle suit from Wonder Woman 1984, adorned with a giant pair of wings. If a suit like that is in your plans, you’ll be warmly welcomed at a cosplay convention – and [Liz] had her sights on the San Diego Comic Con. With an ambitious goal of participating in the Comic Con’s cosplay contest, the suit had to be impressive – and impressive, it indeed was, not just for its looks, but for its mechanics too.

[Liz] tells us everything – from producing the wings and painting them, to keeping them attached to the body while distributing the weight, and of course, things like on-venue nuances and safety with regard to other participants. The dark side of cosplay-building reality isn’t hidden either – talking, of course, about the art of staying within a reasonably tight budget. This build takes advantage of a hackerspace that [Liz] is an active member of – [Crash Space] in LA. Everything gets used – laser cutting, 3D printing, and even custom jigs for bending the wings’ structural PVC pipes.

It would have been a travesty not to have the wings move at will, of course, and [Liz] had all the skills needed to make the wings complete. She went for two linear actuators, walking us through the mechanical calculations and considerations required to have everything fit together. It’s not easy to build a set of wings on its own, let alone one that moves and doesn’t crumble as you use it. If you have already tried bringing mechanical creations like this to life, you can see the value in what [Liz] shares with us; if you haven’t yet delved into it, this video will help you avoid quite a few pitfalls while setting an example you can absolutely reach.
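To get a rough idea of the kind of actuator sizing involved, the static torque the wing’s weight produces about the shoulder pivot can be divided by the actuator’s moment arm. The sketch below is a generic first-pass estimate with made-up numbers, not [Liz]’s actual calculation, and it ignores dynamics and the actuator’s changing geometry through the stroke.

```python
import math

def actuator_force(wing_mass_kg, cg_dist_m, act_dist_m, wing_angle_deg):
    """Rough static force a linear actuator must supply to hold one wing.

    Torque about the shoulder pivot from the wing's weight is
    m * g * r_cg * cos(theta); dividing by the actuator's moment arm
    gives the force, assuming the actuator pushes roughly perpendicular
    to that arm. All inputs here are illustrative placeholders.
    """
    g = 9.81  # m/s^2
    torque = wing_mass_kg * g * cg_dist_m * math.cos(math.radians(wing_angle_deg))
    return torque / act_dist_m

# A hypothetical 2 kg wing with its center of gravity 0.4 m out, driven
# through a 5 cm moment arm, held horizontal:
print(actuator_force(2.0, 0.4, 0.05, 0.0))  # ~157 N
```

The lesson hiding in the arithmetic: a short moment arm multiplies the required force fast, which is why actuator attachment points matter as much as the actuator’s rating.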

The suit was a resounding success at the con, and got [Liz] some well-earned awards – today, the suit’s story is here for the hackers’ world. Now, your cosplay aspirations have an inspiring real-life journey to borrow from, and we thank [Liz] for sharing it with us.

Continue reading “Supercon 2022: [Liz McFarland] Builds Golden Wings, Shows You How”

Blind Camera: Visualizing A Scene From Its Sounds Alone

A visualization by the Blind Camera based on recorded sounds and the training data set for the neural network. (Credit: Diego Trujillo Pisanty)

When we see a photograph of a scene, we can likely imagine what sounds would go with it, but what if this gets inverted, and we have to imagine the scene that goes with the sounds? How close would we get to reconstructing the scene in our mind, or would the biases of our upbringing and background render this a near-impossible task? This is essentially the focus of a project by [Diego Trujillo Pisanty] which he calls Blind Camera.

Based on video data recorded in Mexico City, a neural network created with TensorFlow was trained on an RTX 3080 GPU using a dataset of frames from these videos, each associated with a sound. As a result, when the trained neural network is presented with a sound profile (the ‘photo’), it attempts to reconstruct the scene based on this input and its model, all of which has been adapted to run on a single Raspberry Pi 3B board.
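As a rough idea of what a ‘sound profile’ might look like as model input, here’s a minimal log-magnitude spectrogram in plain NumPy. This is a generic sketch of the audio-to-features step such a model could train on, not [Diego]’s actual pipeline; the window and hop sizes are arbitrary choices.

```python
import numpy as np

def sound_profile(audio, n_fft=512, hop=256):
    """Turn a mono audio clip into a log-magnitude spectrogram.

    Slices the signal into overlapping Hann-windowed frames, takes the
    real FFT of each, and compresses the magnitudes with log1p. The
    result is a (freq_bins, time_steps) array a vision-style network
    could treat like an image.
    """
    window = np.hanning(n_fft)
    frames = [audio[i:i + n_fft] * window
              for i in range(0, len(audio) - n_fft, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames), axis=1))
    return np.log1p(spec).T
```

Pair each such spectrogram with the video frame it was recorded alongside, and you have exactly the kind of (sound, image) dataset the project describes.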

However, since all the model knows are the sights and sounds of Mexico City, the resulting image will always be presented as a composite of scenes from this city. As [Diego] himself puts it: for the device, everything is a city. It’s an excellent demonstration that not only are neural networks limited by their training data, but so too are we humans.

Continue reading “Blind Camera: Visualizing A Scene From Its Sounds Alone”

Detail of a circuit sculpture in the shape of a lighthouse

Op Amp Contest: This Lighthouse Sculpture Flickers In The Rhythm Of Chaos

Op amps are typically used to build signal processing circuits like amplifiers, integrators and oscillators. Their functionality can be described by mathematical formulas that have a single, well-defined solution. However, not every circuit is so well-behaved, as Leon Chua famously showed in the early 1980s: if you make a circuit with three reactive elements and a non-linear component, the resulting oscillation will be chaotic. Every cycle of the output will be slightly different from its predecessors, and the circuit might flip back and forth between different frequencies.

A light modulated with a chaotic signal will appear to flicker like a candle flame, which is the effect [MaBe42] was looking for when he built a lighthouse-shaped circuit sculpture. Its five differently-colored LEDs are driven by a circuit known as Sprott’s chaotic jerk circuit. A “jerk”, in this context, is the third-order derivative of a variable with respect to time – accordingly, the circuit uses three RC integrators to implement its differential equation, along with a diode to provide the nonlinearity.
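For the curious, a commonly cited form of Sprott’s jerk system, x''' = -a*x'' - x' + |x| - 1, is easy to explore numerically; the |x| term is the kind of nonlinearity a diode network provides in hardware. The parameter below is a textbook chaotic value from the literature — the sculpture’s actual component values (and thus its dynamics) will differ.

```python
import numpy as np

def simulate_jerk(a=0.6, dt=0.005, steps=20000, state=(0.1, 0.0, 0.0)):
    """Integrate the jerk system x''' = -a*x'' - x' + |x| - 1.

    State is (x, x', x''), matching the outputs of the three RC
    integrator stages in the analog circuit. Uses a fixed-step
    4th-order Runge-Kutta integrator and returns the x trajectory.
    """
    def deriv(s):
        x, v, w = s
        return np.array([v, w, -a * w - v + abs(x) - 1.0])

    s = np.array(state, dtype=float)
    xs = np.empty(steps)
    for i in range(steps):
        k1 = deriv(s)
        k2 = deriv(s + 0.5 * dt * k1)
        k3 = deriv(s + 0.5 * dt * k2)
        k4 = deriv(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        xs[i] = s[0]
    return xs
```

Plot the trajectory and you’ll see the hallmark of chaos the article describes: it never settles into a repeating cycle, which is exactly what makes the LED flicker look organic rather than mechanical.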

The lighthouse has three chaotic oscillators, one in each of its legs. Their outputs are used to drive simple pulse-width modulators that power the LEDs in the top of the tower. [MaBe42] used the classic LM358 op amp for most of the circuits, along with 1N4148 diodes where possible and 1N4004s where needed – not for their higher power rating, but for their stronger leads. As is common in circuit sculptures, the electronic components are also part of the tower’s structure, and it needs to be quite sturdy to support its 46 cm height.

[MaBe42] used 3D printed jigs to help in assembling the various segments, testing each circuit before integrating it into the overall structure. The end result is a beautiful ornament for any electronics lab: a wireframe structure with free-hanging electronic components and randomly flickering lights on top. Want to learn more about circuit sculpture? Check out this great talk from Remoticon 2020.

Continue reading “Op Amp Contest: This Lighthouse Sculpture Flickers In The Rhythm Of Chaos”