Trees Turned Into Wind Turbines, Non-Destructively

Trees and forests are an incredibly important natural resource, and not just for lumber and agricultural products: they sustain a huge amount of biodiversity across the globe, stabilize their local environments, and help fight climate change by sequestering atmospheric carbon. But the one thing they don’t do is make electricity. At least, not directly. [Concept Crafted Creations] is working on solving this issue by essentially turning an unmodified tree into a kind of wind turbine.

The turbine works by first attaching a linear generator to the trunk of a tree. This generator has a hand-wound set of coils on the outside, with permanent magnets on a shaft that travels up and down inside the coils. The motion to power the generator comes from a set of ropes connected to a branch high up in the tree. When the wind moves the branch, the ropes transfer the energy to a 3D-printed rotational mechanism, which drives a pulley attached to a gearbox that pumps the generator up and down. The more ropes, branches, and generators attached to a tree, the more electricity can be generated.
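
For a feel for the electrical side of things, a linear generator’s open-circuit voltage follows Faraday’s law: the number of turns times the rate of change of magnetic flux through the coil. Below is a back-of-envelope sketch of that estimate; every figure in it (turns, field strength, coil area, stroke length, shaft speed) is a hypothetical placeholder rather than a measurement from this build, and the real output also depends on the load and on how hard the branch is actually swaying.

```python
# Back-of-envelope estimate for a linear generator, using Faraday's law:
#   EMF ~ N * dPhi/dt, with the flux swinging from ~0 to ~B*A over one stroke.
# All numbers below are hypothetical placeholders, not values from this project.

N = 400          # turns of wire in the hand-wound coil
B = 1.2          # tesla, field of the permanent magnet stack at the coil
A = 3.0e-4       # m^2, cross-sectional area of the coil bore (~2 cm diameter)
stroke = 0.10    # m, how far the magnets travel through the coil
v = 0.5          # m/s, average shaft speed on a decent gust

# Flux per turn changes by roughly B*A over a stroke that takes stroke/v seconds.
emf = N * B * A * (v / stroke)
print(f"Rough open-circuit EMF: {emf:.2f} V")   # ~0.72 V with these numbers
```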

Admittedly, this project is still a proof-of-concept, although the prototype does appear to be running on a real tree in a forest right now. [Concept Crafted Creations] hopes to work with others building similar devices to refine the idea and produce better prototypes in the future. This isn’t the only alternative to the traditional bladed wind turbine design, either. It’s also possible to build a wind-powered generator with no moving parts that relies on vibrations instead of rotational motion.

Continue reading “Trees Turned Into Wind Turbines, Non-Destructively”

Creating A Twisted Grid Image Illusion With A Diffusion Model

Images that can be interpreted in a variety of ways have existed for many decades, with the classical example being Rubin’s vase, which some viewers see as a vase and others as a pair of human faces.

When the duck becomes a bunny, if you ignore the graphical glitches that used to be part of the duck. (Credit: Steve Mould, YouTube)

Where things get trickier is when you want to create an image that changes into something else, and still looks realistic, when each section of a 3×3 grid is rotated. In a video, [Steve Mould] explains how this can be accomplished by using a diffusion model to identify shared characteristics of two images and create an output that effectively contains the essential features of both.

Naturally, this process can be done by hand too, with the goal always being to create a plausible image in either orientation, with enough detail to trick the brain into filling in the rest and heading down the path of interpreting what the eye sees as a duck, a bunny, a vase, or the outline of two faces.

Using a diffusion model to create such illusions is quite a natural fit, as it works by filling in noise until a plausible enough image begins to appear. Of course, whether the result is a viable image is ultimately determined not by the model but by the viewer, as humans are susceptible to such illusions while machine vision still struggles to distinguish a cat from a loaf and a raisin bun from a spotted dog. The imperfections of diffusion models would seem to be a benefit here, as they will happily churn through abstractions and iterations with no understanding or interpretive bias, while the human steers them towards a viable interpretation.
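
One published way to make this kind of two-in-one image is to run the reverse diffusion process under two views of the same latent at once: estimate the clean image for the grid as-is and for the grid with its cells rotated, average the two estimates, and keep stepping, so the sample gets nudged towards being plausible in both orientations. The sketch below is a deliberately simplified, model-agnostic illustration of that idea rather than [Steve Mould]’s exact pipeline: denoise() is a hypothetical stand-in for a real diffusion model, and the update step is schematic rather than a faithful sampler.

```python
import numpy as np

def rotate_grid_cells(img: np.ndarray, k: int = 1) -> np.ndarray:
    """View transform: rotate each cell of a 3x3 grid by 90*k degrees.

    Assumes a square image so every cell keeps its shape after rotation.
    """
    h, w = img.shape[:2]
    ch, cw = h // 3, w // 3
    out = img.copy()
    for r in range(3):
        for c in range(3):
            cell = img[r*ch:(r+1)*ch, c*cw:(c+1)*cw]
            out[r*ch:(r+1)*ch, c*cw:(c+1)*cw] = np.rot90(cell, k)
    return out

def denoise(x: np.ndarray, t: int, prompt: str) -> np.ndarray:
    """Hypothetical stand-in for a diffusion model's clean-image estimate at step t."""
    raise NotImplementedError("plug a real diffusion model in here")

def twisted_grid_sample(shape, steps, prompt_a, prompt_b, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)        # start from pure noise
    for t in reversed(range(1, steps + 1)):
        # Clean-image estimate for view A (grid as-is) ...
        est_a = denoise(x, t, prompt_a)
        # ... and for view B (cells rotated), mapped back into view A's frame.
        est_b = rotate_grid_cells(denoise(rotate_grid_cells(x, 1), t, prompt_b), -1)
        # Averaging keeps the sample plausible in both views; then take a
        # simplified step towards that combined target (not a faithful sampler).
        target = 0.5 * (est_a + est_b)
        x = x + (target - x) / t
    return x
```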

Continue reading “Creating A Twisted Grid Image Illusion With A Diffusion Model”

Blowing Up Shell Scripts

One of the most universal experiences of any Linux or Unix user is working through a guide or handbook and coming across an almost unbelievably complex line of code meant to be executed in a shell. When you encounter a snippet like this, it’s difficult to imagine any human ever having written it in the first place, but with some dedication it is possible to tease out what these small bits of code do when they’re typed into the terminal and run (unless it’s something like :(){ :|:& };: but that’s another story entirely). [noperator] recently built a tool that helps users in this predicament understand these shell scripts by expanding them into a more human-intelligible form.

The tool is named sol, and it does much more than expand shell one-liners into a readable format. It also provides an interactive shell environment where the user can explore the exploded code in detail, modify it in any way they see fit, and collapse it back down to a single line so it can easily be sent to other users. It can be used with most of the major text editors or piped directly to standard input, and it has a number of other options such as custom configurations and the ability to flag non-standard bits of code that might not be compatible from one shell environment to another, as well as help translate those bits.
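
To get a feel for what “exploding” a one-liner means, here’s a toy Python sketch that breaks a simple pipeline into one stage per line. It’s only a crude illustration of the general idea, not how sol itself parses or formats anything, and it only understands plain pipelines with spaces around the pipe character.

```python
import shlex

def explode_pipeline(one_liner: str) -> str:
    """Toy illustration: split a shell pipeline into one stage per line.

    Only handles simple pipes with spaces around them; real tools also deal
    with subshells, redirections, command substitution, and much more.
    """
    stages, current = [], []
    for token in shlex.split(one_liner, posix=True):
        if token == "|":
            stages.append(current)
            current = []
        else:
            current.append(token)
    stages.append(current)

    lines = []
    for i, stage in enumerate(stages):
        prefix = "  | " if i else ""
        cont = " \\" if i < len(stages) - 1 else ""
        lines.append(prefix + " ".join(shlex.quote(t) for t in stage) + cont)
    return "\n".join(lines)

print(explode_pipeline("ps aux | grep ssh | awk '{print $2}' | head -n 3"))
```

Run on that example, it prints the same pipeline spread over four continuation lines, one command per line, which is far easier to read than the original string.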

[noperator] has made the code available on the linked GitHub page for anyone curious to try it, and has a to-do list for future versions of the tool as well, including support for shells beyond bash. We’d definitely recommend a tool like this, especially if you’re still relatively new to bash scripting (or shell scripting in general), and, as always, we’d just like to remind everyone not to blindly copy and paste commands into their terminal windows. If you’re the type of person to go out on a limb and run crazy commands to see what they actually do, though, make sure you’re at least logged into the right computer first.

Upgraded Raster Laser Projector Goes RGB

We covered a scanning laser project by [Ben Makes Everything] last year, and now he’s back with a significant update. [Ben]’s latest project offers higher resolution and RGB lasers. A couple of previous versions of the device used the same concept of a rotating segmented mirror synchronised to a pulsed laser diode to create scanlines. When projected onto a suitable surface, the distorted, pixelated characters looked quite funky, but there was clearly room for improvement.

More scanlines and a faster horizontal pixel rate

The previous device used slightly inclined mirrors to deflect the beam into scanlines, with one mirror per scanline limiting the vertical resolution. To improve resolution, the mirrors were replaced with identically aligned mirrors of the type used in laser printers, handling the horizontal scanning. An off-the-shelf laser galvo was used for vertical scanning, allowing faster scanning thanks to its small deflection angle. This setup is quicker than the usual vector galvo application, as the smaller movements require less time to complete. With the resolution improvement in hand, a controller upgrade to a Teensy 4 gave more processing bandwidth than the previous Arduino and a consequent massive improvement in image clarity.
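
To see why the controller upgrade matters, it helps to run some rough numbers on how little time each pixel gets once a spinning polygon mirror is doing the horizontal scan. All of the figures below are made up for illustration and are not taken from [Ben]’s build.

```python
# Rough pixel-timing arithmetic for a polygon-mirror raster scanner.
# Every number here is a hypothetical placeholder, not a spec from this project.

facets = 6             # mirror facets on the spinning polygon
rpm = 10_000           # polygon rotation speed
pixels_per_line = 200  # horizontal resolution
lines_per_frame = 64   # vertical resolution (galvo steps)

scanlines_per_second = facets * rpm / 60            # one scanline per facet pass
pixel_rate = scanlines_per_second * pixels_per_line
frame_rate = scanlines_per_second / lines_per_frame

print(f"{scanlines_per_second:.0f} scanlines/s")                                # 1000
print(f"{pixel_rate / 1e3:.0f} kpixel/s, {1e6 / pixel_rate:.1f} us per pixel")  # 200, 5.0
print(f"{frame_rate:.1f} frames/s")                                             # 15.6
```

Modulating a laser every few microseconds while keeping the mirror and galvo in sync is exactly the sort of job where a 600 MHz Teensy 4 has far more headroom than a classic 16 MHz Arduino.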

Finally, monochrome displays don’t look anywhere near as good as an RGB setup. [Ben] utilised a dedicated RGB laser setup, since he had trouble sourcing appropriate dichroic mirrors to match individually available lasers. This setup used four lasers (two of them red) and the correct dichroic mirrors to combine each laser source into a single beam path, which was then sent to the galvo. [Ben] tried to find a DAC solution fast enough to drive the lasers for proper colour mixing, but ended up shelving that idea for now and sticking with direct on-off control. This resulted in a palette of just seven colours, but that’s still a lot better than monochrome.
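
The seven colours follow directly from treating each of the red, green, and blue channels as a simple on/off switch: of the 2^3 = 8 possible combinations, all-off is just black, which leaves seven visible colours. A quick sketch enumerating that palette:

```python
from itertools import product

# With only on/off control of the R, G and B lasers (no analogue mixing),
# each pixel is one of 2^3 = 8 combinations; dropping all-off leaves 7 colours.
names = {
    (1, 0, 0): "red",
    (0, 1, 0): "green",
    (0, 0, 1): "blue",
    (1, 1, 0): "yellow",
    (1, 0, 1): "magenta",
    (0, 1, 1): "cyan",
    (1, 1, 1): "white",
}

palette = [rgb for rgb in product((0, 1), repeat=3) if any(rgb)]
for rgb in palette:
    print(rgb, names[rgb])
print(f"{len(palette)} colours")   # 7
```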

The project’s execution is excellent, and care was taken to make it operate outdoors with a battery. Even with appropriate safety measures, you don’t really want to play with high-intensity lasers around the house!

Here’s the previous version we covered, a neat DIY laser galvo using steppers, and a much older but very cool RGB vector projector.

Continue reading “Upgraded Raster Laser Projector Goes RGB”

FLOSS Weekly Episode 801: JBang — Not Your Parents’ Java Anymore

This week Jonathan Bennett and Jeff Massie chat with Max Rydahl Andersen about JBang, the cross-platform tool for running Java as a system scripting language. That’s a bit harder than it sounds, particularly when it comes to taking advantage of Java’s rich debugging capabilities and the ecosystem of libraries that are available. Tune in to get the details, as well as how polyglot files are instrumental to making JBang work!

Continue reading “FLOSS Weekly Episode 801: JBang — Not Your Parents Java Anymore”

Meet The Winners Of The 2024 Tiny Games Contest

Over the years, we’ve figured out some pretty sure-fire ways to get hackers and makers motivated for contests. One of the best ways is to put arbitrary limits on different aspects of the project, such as how large it can be or how much power it can consume. Don’t believe us? Then just take a look at the entries of this year’s Tiny Games Contest.

Nearly 80 projects made it across the finish line this time, and our panel of judges have spent the last week or so going over each one to try and narrow it down to a handful of winners. We’ll start things off with the top three projects, each of which will be awarded a $150 gift certificate from our friends at DigiKey.

First: Sub-Surface Simon

While this contest saw a lot of excellent entries, we don’t think anyone is going to be surprised to see this one take the top spot. Earning an exceptionally rare perfect ten score from each of our judges, Sub-Surface Simon from [alnwlsn] grabbed onto the theme of this contest and ran like hell with it. Continue reading “Meet The Winners Of The 2024 Tiny Games Contest”

Airline Seats Are For Dummies

You normally wouldn’t think a lot goes into the construction of a chair. However, when that chair is attached to a commercial jet, there’s a lot of technology that goes into making sure it is safe. According to a recent BBC article, testing involves crash dummies and robot arms.

Admittedly, these are first-class and business-class seats. Robots do repetitive mundane tasks like opening and closing the tray table many, many times. They also shoot the seats, with crash dummies aboard, at up to 16 G of acceleration. Just to put that into perspective, a jet pilot ejecting experiences about the same amount of force, while a MiG-35 pilot might experience around 10 G.

We didn’t realize how big the airline seat industry is in Northern Ireland. Thompson, the company that has the lab in question, is only one of the companies in the country that builds seats. Apparently, the industry suffered from the global travel slowdown during the pandemic but is now bouncing back.

While people worry about robots taking jobs, we can’t imagine anyone wanting to spend all day returning their tray table to the upright and locked position repeatedly. We certainly don’t want to be 16 G crash dummies, either.

Crash dummies have a long history, of course. Be glad airliners don’t feature ejector seats.