Robotic Acrobot Aces The Moves

[Daniel Simu] is a performance artist, among many other things, and does acrobatic shows, often with a partner “flyer”. Training for his acts gets interrupted when his flyer isn’t available due to travel, injury, or other reasons. This prompted him to build Acrobotics — a robotic assistant that lets him continue training uninterrupted.

He has some electronics and coding chops, but had to teach himself CAD so that he could do all of the design, assembly, and programming himself. Acrobotics was developed as part of a Summer Sessions residency at V2_ (Lab for the Unstable Media) in Rotterdam in 2022.

The design is built around a mannequin body, and things are quite simple at the moment. There are only two rotational joints — one for each arm at the shoulder — and no other articulation. Two car wiper motors rotate the arms 360° in either direction, and continuous-rotation potentiometers attached to the motors provide position feedback.

An ESP32 controls the whole thing, and the motors get juice via a pair of BTS7960 motor drivers. All of this is housed in a cage built from 15 mm aluminium extrusion and embedded in the torso of the mannequin. [Daniel] doesn’t explain how the motor movements are synchronized with the music, but we do see a trailing cable attached to the mannequin. The cable likely carries power, along with some form of data or timing signals.
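The write-up doesn’t include firmware details, but the heart of a position loop like this is easy to sketch. Here’s a minimal, hypothetical example in Python (the real project presumably runs C/C++ on the ESP32): the wrap-around error calculation for a continuous-rotation pot, plus a proportional speed command for a BTS7960-style driver. The gain and PWM range are illustrative, not from the project.

```python
def angle_error(target_deg: float, current_deg: float) -> float:
    """Shortest signed angular error, handling the 0/360 wrap
    of a continuous-rotation potentiometer."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def p_control(target_deg: float, current_deg: float,
              kp: float = 2.0, max_pwm: int = 255) -> int:
    """Proportional speed command for one motor driver channel.
    Sign selects direction, magnitude is the PWM duty (0-255)."""
    cmd = kp * angle_error(target_deg, current_deg)
    return int(max(-max_pwm, min(max_pwm, cmd)))
```

In a real firmware loop this would run at a fixed rate, with the pot reading low-pass filtered before use — but the wrap-around handling is the part that usually trips people up.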

He’s working on the next version of the prototype, so we hope to see improved performances soon. There’s definitely scope for adding a suite of sensors — an IMU would help a lot to determine spatial orientation, maybe some ultrasonic sensors, or a LiDAR for object detection or mapping, or additional articulated joints at the elbows and wrists. We gotta love “feature creep”, right?

Check out the two videos after the break — in the first one, he gives an overview of Acrobotics, and the second is the actual performance. Robot or not, it’s quite an amazing project and performance.
CAVEAT: We know calling this a “robot” is stretching the definition, by a lot, but we’re going to let it slip through.

Continue reading “Robotic Acrobot Aces The Moves”

Celebrating A Decade Of Bootleg Hackaday Merch

A listener of the podcast recently wrote in to tell us that, in the process of trying to purchase a legitimate Hackaday t-shirt, they discovered this 2012 Instructable from [yeltrow] that covers how you can cheaply crank out your own Wrencher shirts via screen printing.

Now historically, as long as you’re not trying to make a buck off of our name, we’ve never felt the need to stop folks from putting our logo on their projects. So we’re not too concerned that somebody was making Wrencher shirts, especially since they were almost certainly for their own personal use. Though the fact that [yeltrow] apparently described the project as a “Hackster-Style shirt” to try and avoid using our name ended up being a prophetic 4D chess meta-joke that you couldn’t make up if you tried. Continue reading “Celebrating A Decade Of Bootleg Hackaday Merch”

Fifteen Flat CRTs And A Bunch Of Magnets Make For Interactive Fun

If you were a curious child growing up when TVs were universally equipped with cathode ray tubes, chances are good that you discovered the effect a magnet can have on a beam of electrons. Watching the picture on the family TV warp and twist like a funhouse mirror was good clean fun, or at least it was right up to the point where you permanently damaged a color CRT by warping the shadow mask with a particularly powerful speaker magnet — ask us how we know.

To bring this experience to a generation who may never have seen a CRT display in their lives, [Niklas Roy] developed “Deflektron”, an interactive display for a science museum in Switzerland. The CRTs that [Niklas] chose for the exhibit were the flat-ish monochrome tubes that were used in video doorbell systems in the late 2000s, like the one [Bitluni] used for his CRT Game Boy. After locating fifteen of these things — probably the biggest hack here — they were stripped out of their cases and mounted into custom modules. The modules were then mounted into a console that looks a little like an 80s synthesizer.

In use, each monitor displays video from a camera mounted to the module. Users then get to use a selection of tethered neodymium magnets to warp and distort their faces on the screen. [Niklas] put a lot of thought into both the interactivity of the exhibit and the practical realities of a public installation, which will likely take quite a beating. He’s no stranger to such public displays, of course — you might remember his interactive public fountain, or this cyborg baby in a window.

Continue reading “Fifteen Flat CRTs And A Bunch Of Magnets Make For Interactive Fun”

A wall clock made from wires and electronic components

Form Follows Function In This Circuit Sculpture Clock

Electronic components are strictly functional objects: their appearance is determined by the function they’re meant to fulfil. But that doesn’t mean there’s no beauty in them. In fact, a whole discipline called circuit sculpture exists that aims to make beautiful shapes out of nothing more than electronic components and wires. Today we can show you [Maarten Tromp]’s latest work in this field: a wall-mounted clock that he’s christened the Clock Sculpture.

The clock’s main structure consists of two concentric rings made from galvanized steel wire, held together by twelve spokes. All components are soldered directly onto those two rings, with no additional mechanical support. Steel isn’t the greatest material for soldering to, but [Maarten] managed to make it work with a high-wattage soldering iron and a bit of plumbers’ flux.

The overall design is simple but clever: the outer ring holds 60 LEDs to indicate the minutes, with every fifth LED always illuminated dimly in order to provide a background reference in dark conditions. There are 24 LEDs on the inner ring to indicate the twelve hours as well as the “half-hours” in between. Without these, the dial would look a bit odd at 30 minutes past the hour.
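That dial layout is simple enough to express in a few lines. This is our own reading of the description above, not [Maarten]’s firmware: brightness levels for the 60-LED minute ring (with the every-fifth-LED dim background) and the 24-LED hour ring, where the lit inner LED advances to the half-hour position after :30.

```python
def led_states(hours: int, minutes: int):
    """Brightness per LED (0=off, 1=dim, 2=bright) for the two rings.
    Minute ring: 60 LEDs, every fifth dimly lit as a reference.
    Hour ring: 24 LEDs, one bright at the hour or half-hour position."""
    minute_ring = [1 if i % 5 == 0 else 0 for i in range(60)]
    minute_ring[minutes] = 2  # current minute overrides the dim marker
    hour_ring = [0] * 24
    # Two inner-ring positions per hour; step to the half-hour after :30.
    hour_ring[(hours % 12) * 2 + (1 if minutes >= 30 else 0)] = 2
    return minute_ring, hour_ring
```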

Detail of a circuit sculpture clock

A mains transformer, plus a single diode, a buffer capacitor and a 7805 regulator form a simple DC power supply, with its negative terminal connected to the steel frame. Time is kept by an ATtiny13A that counts mains frequency pulses. There’s no way to adjust the time: you’ll have to plug in the clock exactly at noon or midnight in order to synchronize it with the outside world. A crude method perhaps, but one that fits well with the clock’s bare-bones aesthetic.
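The timekeeping arithmetic itself is trivial, which is exactly why it fits on an ATtiny13A. A quick Python sketch of the idea (the real chip would do this in C, and we’re assuming 50 Hz European mains here): count zero-crossings, divide down to seconds, and derive the time from an assumed noon/midnight start.

```python
MAINS_HZ = 50  # assumed European mains; the clock just counts cycles

def ticks_to_time(ticks: int):
    """Convert accumulated mains cycles into (hours, minutes, seconds)
    on a 12-hour dial, starting from 12:00 at power-up."""
    seconds = ticks // MAINS_HZ
    return (seconds // 3600) % 12, (seconds // 60) % 60, seconds % 60
```

Long-term, mains frequency is actively regulated to average out, which is why this scheme keeps surprisingly good time despite having no crystal.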

The individual LEDs are driven by a set of twelve 74HC595 shift registers, all mounted dead-bug style between the two rings. Signals and power are carried between the chips by inconspicuous grey wires taken from old IDE cables; this gives the clock a clean, uncluttered appearance. [Maarten] has had the sculpture clock in his office for several months and while it apparently took some time to get used to, he claims it’s easy to read in bright and dark conditions.
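Twelve daisy-chained ’595s give 96 outputs for the 84 LEDs, with a few spares left over. How the frame gets packed into bytes is our guess — the actual bit-to-LED mapping is whatever the wiring dictates — but the general shape looks like this: build one byte per register, then shift them out with the furthest register’s data first.

```python
def pack_frame(led_on):
    """Pack 96 LED on/off states (12 chained 74HC595s x 8 outputs)
    into the byte sequence to shift out. Bit/LED mapping is hypothetical."""
    assert len(led_on) == 96
    bytes_out = []
    for reg in range(12):
        b = 0
        for bit in range(8):
            if led_on[reg * 8 + bit]:
                b |= 1 << bit
        bytes_out.append(b)
    # The register furthest down the chain must receive its byte first.
    return bytes_out[::-1]
```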

Circuit sculpture has formed the basis for several stunning clock projects: this Tie Fighter-shaped clock for instance, or this insanely complex LED clock. Our 2020 Circuit Sculpture contest yielded many breathtaking designs, too.

Giving Stable Diffusion Some Depth

You’ve likely heard quite a bit of buzz over the last few months about Stable Diffusion. The new version (v2) has come out, and in addition to the standard image-to-image and text-to-image modes, it also has a depth-image-to-image mode that can be incredibly useful. [Andrew] has a write-up that guides you through using this mode.

The basic idea is that you feed both an image and a depth map into the model, which lets you control what gets put where. Stable Diffusion is a bit confusing, but we already have some great resources to wrap your head around it. In terms of input, you can use a depth map from a camera with LiDAR (many recent phones include one) or have another model (like MiDaS) estimate it from a 2D picture. This becomes powerful when you can preserve a specific composition, such as an iconic scene from a well-known movie. You can keep the characters’ poses on the screen but transform the style of the scene into whatever you wish (as seen above).
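One detail worth knowing if you wire this up yourself: MiDaS outputs *relative* depth with an arbitrary scale, so it has to be rescaled before conditioning the model. A simplified sketch of that preprocessing step (this mirrors what the depth-to-image pipeline does internally, though the exact code differs):

```python
import numpy as np

def normalize_depth(depth: np.ndarray) -> np.ndarray:
    """Rescale a MiDaS-style relative depth map to [-1, 1], the range
    the depth-conditioned model expects. Scale/offset of the input
    are arbitrary, so only the relative ordering matters."""
    d_min, d_max = depth.min(), depth.max()
    return 2.0 * (depth - d_min) / max(d_max - d_min, 1e-6) - 1.0
```

In practice the depth map is also downsampled to match the latent resolution before being concatenated into the model’s input.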

We have already covered a technique to generate textures right in Blender, and this new depth information has already been put to use there to improve the accuracy of the generated textures.

[Justin Alvey] used it to create architectural photos from dollhouse furniture. Using the MiDaS model, he estimated the depth and threw away the RGB aspects by setting the denoising strength to maximum. The simplified dollhouse furniture was easily recognizable to the model, which helped produce great results.

The only downside is that the close-up perspective still gives the results a rather dollhouse feel; changing the focal length and moving farther away helps. Overall, it’s a clever use of what the new AI model can do. It’s a fast-moving space, so this will likely be out of date in a few months.


More Detail On That Fantastic Lego OLED Brick

It’s always great when we get a chance to follow up on a previous project with more information, or further developments. So we’re happy that [“Ancient” James Brown] just dropped a new video showing the assembly of his Lego brick with a tiny OLED screen inside it. The readers are too, apparently — we got at least half a dozen tips on this one.

We’ve got to admit that this one’s a real treat, with a host of interesting skills on display. Our previous coverage on these bedazzled bricks was disappointingly thin on details, and now the original tweets even seem to have disappeared entirely. In case you didn’t catch the original post, [James] found a way to embed a microcontroller and a remarkably small OLED screen into a Lego-compatible brick — technically a “slope 45 2×2, #3039” — that does a great job of standing in for a tiny computer monitor.

Continue reading “More Detail On That Fantastic Lego OLED Brick”

Closeup of a film restorer's hand holding a 35mm film print to check for defects as it goes into a film scanner

35mm Film Restoration Process Explained

For a large part of the 20th century, motion pictures were distributed on nitrate film. Although cheaper for the studios, this film was highly flammable and prone to decay. On top of that, most film prints were simply discarded once they had been through their run at the cinema, so a lot of film history has been lost.

Sometimes, the rolls of projected film would be kept by the projectionist and eventually found by a collector. If the film was too badly damaged to project again, it might still get tossed. Pushing against this tide of decay and destruction are small groups of experts who scan and restore these films for the digital age.

still showing the difference in quality between a 16mm print of a 35mm animated movie and a new scan of the 35mm original
The quality difference between a smaller-format print and the original restored negative can be startling

The process is quite involved – starting with checking every single frame of film by hand and repairing any damaged perforations or splices that could come apart in the scanner. Each frame is then automatically scanned at up to 10K resolution to future-proof the process before being painstakingly digitally cleaned.

The real expertise is in knowing what is damage or dirt, and what is the character of the original film. Especially in stop-motion movies, the subtle changes between frames are really part of the original, so the automatic clean-up tools need to be selectively reined in so as not to lose the charm and art of the film-makers.

The results are quite astonishing and we all have teams like this to thank for protecting our cultural heritage.

If you’re interested in watching the process, then check out the video after the break. If you fancy a go at automatic film digitising yourself (preferably not on unique historical prints!) then we’ve shown projects to do just that in the past.

Thanks to [Cliff Claven] for the tip.

Continue reading “35mm Film Restoration Process Explained”