There seem to be two camps when it comes to recipes: those that measure ingredients by volume, and those that measure them by weight. Gravimetric measurements have the advantage of better accuracy, but at the price of not being able to quickly scoop out a bit of this and a dash of that. It would be nice to get the convenience of volumetric measurements with the accuracy of weighing your ingredients, wouldn’t it?
It would, and that’s just what [Penguin DIY] did with this digital kitchen spoon scale. The build started with, perhaps not surprisingly, a large mixing spoon and a very small kitchen scale. The bowl of the spoon got lopped off the handle and attached to the strain gauge, which was removed from the scale along with its LCD display and circuit board. To hold everything, a somewhat stocky handle was fabricated from epoxy resin sandwiched between aluminum bolsters. Compartments for the original electronics parts, as well as a LiPo battery and USB charger module, were carved out of the resin block, and the electronics were mounted so that the display and controls are easily accessible. The video below shows the build as well as the spoon-scale in action in the kitchen.
We think this is not only a great idea but a fantastic execution. The black epoxy and aluminum look amazing together on the handle, almost like a commercial product. And sure, it would have been easy enough to build a scale from scratch — heck, you might even be able to do away with the strain gauge — but tearing apart an existing scale seems like the right move here.
It seems like the world is ready for a true 3D display. We’ve seen them in sci-fi for decades now, with the ability to view a scene from any angle and inspect it up close. They’ve remained elusive, but that might just be changing thanks to this open-source persistence-of-vision volumetric display.
If the VVD, as it has been named by its creator [Madaeon], looks somewhat familiar, perhaps it’s because editor-in-chief [Mike Szczys] ran into it back in 2019 at Maker Faire Rome. It looks like it has progressed quite a bit since then, but the basic idea is still the same. A thin, flexible membrane, which is stretched across a frame, is attached to articulated arms. The membrane can move up and down rapidly, fast enough that a 1,000-fps high-speed camera is needed to see it move. That allows you to see the magic in action; a digital light processor (DLP) module projects slices of a 3D image onto the sheet, sending the correct image out for each vertical position of the membrane. Carefully coordinating the images creates the POV illusion of a solid image floating in space, which can be observed from any angle, requires no special glasses, and can even be viewed by groups.
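Just to make the timing idea concrete, here’s a minimal C++ sketch of what the projection loop boils down to. To be clear, the slice count, the `Slice` type, and the `membrane_position()` / `dlp_project()` calls are stand-ins we made up for illustration, not [Madaeon]’s actual firmware:

```cpp
#include <vector>

constexpr int kSlices = 96;                  // assumed vertical resolution

struct Slice { /* one 2D bitmap of the volume at a given height */ };

// Stubs: in a real build these would talk to a position encoder and the DLP.
double membrane_position() { return 0.5; }   // normalized height, 0..1
void dlp_project(const Slice&) {}            // push one bitmap to the DLP

int main() {
    std::vector<Slice> volume(kSlices);      // a 3D frame, pre-cut into slices
    while (true) {
        // Pick the slice matching where the membrane is *right now*; if the
        // image and the membrane stay in lockstep, the eye fuses the sweep
        // into a solid-looking volume.
        int idx = static_cast<int>(membrane_position() * (kSlices - 1));
        dlp_project(volume[idx]);
    }
}
```

The whole trick lives in that one mapping: keep the projected slice synchronized to the membrane height, and persistence of vision does the rest.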
With displays like this, we’re used to issuing the caveat that “it no doubt looks better in person”, but we have to say that in the included GIFs and videos, the VVD looks pretty darn good. We think this is a natural for inclusion in the 2021 Hackaday Prize, and we’re pleased to see that it made it to the semi-finals of the “Rethink Displays” round.
[Lucas] over at Cranktown City on YouTube has been very busy lately, but despite current appearances, his latest project is not a welder. Rather, he built a very clever gas mixer for filling his homemade CO2 laser tubes; it only looks like a welding machine. (Video, embedded below.)
We’ve been following [Lucas] on his journey to build a laser cutter from scratch — really from scratch, as he built his own laser tube rather than rely on something off-the-shelf. Getting the right mix of gas to fill the tube has been a bit of a pain, though, since he was using a party balloon to collect carbon dioxide, helium, and nitrogen, measuring the diameter of the balloon after each addition to determine the volumetric ratio of each gas. His attempt at automating the process centers around a so-called AirShim, which is basically a flat inflatable bag made of sturdy material that’s used by contractors to pry, wedge, lift, and shim using air pressure.
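For the curious, the balloon method is just sphere arithmetic: each diameter reading gives a running total volume, and the difference between successive readings is the volume of the gas just added. A quick worked example in C++, with completely made-up diameters:

```cpp
#include <cstdio>

// Volume of a sphere from its diameter: V = (pi/6) * d^3
const double kPi = 3.14159265358979;
double sphere_volume(double d) { return kPi / 6.0 * d * d * d; }

int main() {
    // Fabricated diameter readings (cm) after each gas addition.
    const char* gas[] = {"CO2", "He", "N2"};
    double diameter_cm[] = {10.0, 12.0, 13.5};
    double prev = 0.0;
    for (int i = 0; i < 3; ++i) {
        double v = sphere_volume(diameter_cm[i]);
        std::printf("%s added: %.0f cm^3\n", gas[i], v - prev);  // delta = this gas
        prev = v;
    }
}
```

You can see why this gets old fast: the cubed term means a small tape-measure error turns into a big volume error.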
[Lucas]’ first idea was to measure the volume of gas in the bag using displacement of water and some photosensors, but that proved both impractical and unnecessary. It turned out to be far easier to sense when the bag is filled with a simple microswitch; each filling yields a fixed volume of gas, making it easy to figure out how much of each gas has been dispensed. An Arduino controls the pump, a reclaimed fridge compressor, while also monitoring the limit switch, driving the solenoid valves, and tallying the volume of gas dispensed.
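In Arduino terms, the fill-counting logic could be as simple as the sketch below. This is our own hypothetical reconstruction, not [Lucas]’ code; the pin assignments, the valve mapping, and the mix ratio are all invented for illustration:

```cpp
// Hypothetical gas mixer logic: one solenoid valve per gas, and a microswitch
// that trips when the AirShim bag is full, so every fill is a known volume.
const int PUMP_PIN = 2;                   // reclaimed fridge compressor
const int SWITCH_PIN = 3;                 // bag-full microswitch, wired to ground
const int VALVE_PIN[] = {4, 5, 6};        // CO2, N2, He solenoids (assumed pins)
const int FILLS_PER_GAS[] = {1, 1, 8};    // invented example ratio, in bag-fulls

void dispense(int gas, int fills) {
  digitalWrite(VALVE_PIN[gas], HIGH);     // route this gas toward the bag
  for (int i = 0; i < fills; i++) {
    digitalWrite(PUMP_PIN, HIGH);         // compressor on
    while (digitalRead(SWITCH_PIN) == HIGH) {}  // wait for the switch to trip
    digitalWrite(PUMP_PIN, LOW);          // bag full: one fixed volume counted
    // ...vent the bag into the laser tube here before the next fill...
  }
  digitalWrite(VALVE_PIN[gas], LOW);
}

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(SWITCH_PIN, INPUT_PULLUP);      // reads LOW when the bag is full
  for (int i = 0; i < 3; i++) pinMode(VALVE_PIN[i], OUTPUT);
  for (int gas = 0; gas < 3; gas++) dispense(gas, FILLS_PER_GAS[gas]);
}

void loop() {}
```

Because the bag always fills to the same volume, counting fills per gas is all the math the mixer ever has to do.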
Judging by the video below, the mixer works pretty well, and we’re impressed by its simplicity. We’d never seriously thought about building our own laser tube before, but seeing [Lucas] have at it makes it seem quite approachable. We’re looking forward to watching his laser project come together.
Bad news, Martian helicopter fans: Ingenuity, the autonomous helicopter that Perseverance birthed onto the Martian surface a few days ago, will not be taking the first powered, controlled flight on another planet today as planned. We’re working on a full story so we’ll leave the gory details for that, but the short version is that while the helicopter was undergoing a full-speed rotor test, a watchdog timer monitoring the transition between pre-flight and flight modes in the controller tripped. The Ingenuity operations team is going over the full telemetry and will reschedule the rotor test; as a result, the first flight will occur no earlier than Wednesday, April 14. We’ll be sure to keep you posted.
Anyone who has ever been near a refinery or even a sewage treatment plant will have no doubt spotted flares of waste gas being burned off. It can be pretty spectacular, like an Olympic torch, but it also always struck us as spectacularly wasteful. Aside from the emissions, it always seemed like you could at least try to harness some of the energy in the waste gases. But apparently the numbers just never work out in favor of tapping this source of energy, or at least that was the case until the proper buzzword concentration in the effluent was reached. With the soaring value of Bitcoin, and the fact that the network now consumes something like 80 TWh a year, building portable mining rigs into shipping containers that can be plugged into gas flaring stacks at refineries is now being looked at seriously. While we like the idea of not wasting a resource, we have our doubts about this; if it’s not profitable to tap into the waste gas stream to produce electricity now, what does tapping it to directly mine Bitcoin really add to the equation?
What would you do if you discovered that your new clothes dryer was responsible for a gigabyte or more of traffic on your internet connection every day? We suppose in this IoT world, such things are to be expected, but a gig a day seems overly chatty for a dryer. The user who reported this over on the r/smarthome subreddit blocked the dryer at the router, which was probably about the only realistic option short of taking a Dremel to the WiFi section of the dryer’s control board. The owner is in contact with manufacturer LG to see if this perhaps represents an error condition; we’d actually love to see a Wireshark dump of the data to see what the garrulous appliance is on about.
As often happens in our wanderings of the interwebz to find the very freshest of hacks for you, we fell down yet another rabbit hole that we thought we’d share. It’s not exactly a secret that there’s a large number of “Star Trek” fans in this community, and that for some of us, the way the various manifestations of the series brought the science and technology of space travel to life kick-started our hardware hacking lives. So when we found this article about a company building replica Tricorders from the original series, we followed along with great interest. What we found fascinating was not so much the potential to buy an exact replica of the TOS Tricorder — although that’s pretty cool — but the deep dive into how they captured data from one of the few remaining screen-used props, as well as how the Tricorder came to be.
And finally, what do you do if you have 3,281 drones lying around? Obviously, you create a light show to advertise the launch of a luxury car brand in China. At least that’s what Genesis, the luxury brand of carmaker Hyundai, did last week. The display, which looks like it consisted mostly of the brand’s logo whizzing about over a cityscape, is pretty impressive, and apparently set the world record for such things, beating out the previous attempt of 3,051 UAVs. Of course, all the coverage we can find on these displays concentrates on the eye-candy and the blaring horns of the soundtrack and gives short shrift to the technical aspects, which would really be interesting to dive into. How are these drones networked? How do they deal with latency? Are they just creating a volumetric display with the drones and turning lights on and off, or are they actually moving drones around to animate the displays? If anyone knows how these things work, we’d love to learn more, and perhaps even do a feature article.
3D-scanning seems like a straightforward process — put the subject inside a motion control gantry, bounce light off the surface, measure the reflections, and do some math to reconstruct the shape in three dimensions. But traditional 3D-scanning isn’t good for subjects with complex topologies and lots of nooks and crannies that light can’t get to. Which is why volumetric 3D-scanning could become an important tool someday.
As the name implies, volumetric scanning relies on measuring the change in volume of a medium as an object is moved through it. In the case of [Kfir Aberman] and [Oren Katzir]’s “dip scanning” method, the medium is a tank of water whose level is measured with high precision by a float sensor. The object to be scanned is dipped slowly into the water by a robot as data is gathered. The robot removes the object, changes the orientation, and dips again. Dipping is repeated until enough data has been collected to run through a transformation algorithm that can reconstruct the shape of the object. Anywhere the water can reach can be scanned, and the video below shows how good the results can be with enough data. Full details are available in the PDF of their paper.
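The key observation is easy to demonstrate: if the object descends in fixed depth steps, the rise in water level between steps is proportional to the object’s cross-sectional area at the waterline. Here’s a toy C++ version with fabricated sensor readings; the real method fuses many such profiles from different orientations, which is where the heavy math in the paper comes in:

```cpp
#include <cstdio>
#include <vector>

int main() {
    const double tank_area_cm2 = 400.0;  // assumed tank cross-section
    const double dz_cm = 0.5;            // dip increment per reading
    // Float-sensor readings (cm) after each dip step, fabricated for demo:
    std::vector<double> level = {10.00, 10.02, 10.05, 10.09, 10.12, 10.13};

    for (size_t i = 1; i < level.size(); ++i) {
        // Level rise times tank area = volume displaced during this step...
        double dV = (level[i] - level[i - 1]) * tank_area_cm2;
        // ...and volume per unit depth is the object's cross-section there.
        double area = dV / dz_cm;
        std::printf("depth %.1f cm: area ~ %.1f cm^2\n", i * dz_cm, area);
    }
}
```

A single dip only gives you this one-dimensional area profile, which is why the robot re-orients the object and dips again and again before reconstruction.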
While optical 3D-scanning with the standard turntable and laser configuration will probably be around for a while, dip scanning seems like a powerful method for getting topological data using really simple equipment.
There’s an especially large focus on 3D displays. Projecting onto screens, droplets of water, spinning objects, and even plasma combustion are covered. But so are the funny physical displays: flip-dots, pin-cushions, and even servo-driven “pixels”.
Flavien Théry’s La Porte
We really liked the section on LCDs with modified polarization layers — we’ve seen some cool hacks using that gimmick, but the art pieces he dredged up look even better. Makes us want to take a second look at that busted LCD screen in the basement.
We’re big fans of the bright and blinky, so it’s no surprise that [Blair] got a bunch of his examples from these very pages. And we’ve covered [Blair]’s work as well: both his Wobbulator and his “Color a Sound” projects. Hackaday: your one-stop-shop for freaky pixels.
[Blair]’s list looks pretty complete to us, but there’s always more out there. What oddball displays are missing? What’s the strangest or coolest display you’ve ever seen?
There’s a new display technique that’s making the blog rounds, and like anything that seems like it’s torn from [George Lucas]’ cutting room floor, it’s getting a lot of attention. It’s a device that can display voxels in midair, forming low-resolution three-dimensional patterns without any screen, any fog machine, or any reflective medium. It’s really the closest thing to the projectors in a holodeck we’ve seen yet, leading a few people to ask how it’s done.
This isn’t the first time we’ve seen something like this. A few years ago, a similar 3D display technology was demonstrated that used a green laser to display tens of thousands of voxels in a display medium. The same company used this technology to draw white voxels in air, without a smoke machine or anything else for the laser beam to reflect off of. We couldn’t grasp how this worked at the time, but with a little bit of research we can find the relevant documentation.
A system like this was first published in 2006, built upon earlier work that only displayed pixels on a 2D plane. The device worked by taking an infrared Nd:YAG laser and focusing the beam to an extremely small point. At that point, the air heats up enough to ionize into plasma, creating a bright, if temporary, point of light. With the laser pulsing several hundred times a second, a picture can be built up from these small plasma bursts.
Moving a ball of plasma around in 2D space is rather easy; all you need are a few mirrors. To add the third dimension, a lens mounted on a linear rail moves back and forth, changing the focal point of the optics. It’s an extremely impressive optical setup, but simple enough to get the gist of.
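As a toy model of the addressing, here’s roughly how one voxel’s coordinates might map onto the hardware: two mirror angles for X and Y, plus a lens carriage position for Z. Every constant and the linear Z mapping are invented for illustration; we haven’t dug into the actual control scheme:

```cpp
#include <cmath>
#include <cstdio>

// Toy model of voxel addressing in a laser-plasma display: X/Y come from
// two steering mirror angles, Z from the focusing lens position on its rail.
struct Actuators { double mirror_x_deg, mirror_y_deg, lens_mm; };

Actuators aim(double x_mm, double y_mm, double z_mm) {
    const double kPi = 3.14159265358979;
    const double throw_mm = 200.0;    // assumed working distance to the voxel
    const double lens_travel = 50.0;  // assumed rail travel (mm)
    Actuators a;
    // Each mirror tilts to deflect the beam along one axis.
    a.mirror_x_deg = std::atan2(x_mm, throw_mm) * 180.0 / kPi;
    a.mirror_y_deg = std::atan2(y_mm, throw_mm) * 180.0 / kPi;
    // The plasma ball forms at the focal point, which tracks the lens
    // position; a simple linear map stands in for the real optics here.
    a.lens_mm = z_mm / throw_mm * lens_travel;
    return a;
}

int main() {
    Actuators a = aim(10.0, -5.0, 20.0);  // one voxel at (10, -5, 20) mm
    std::printf("mirrors: %.2f, %.2f deg; lens at %.2f mm\n",
                a.mirror_x_deg, a.mirror_y_deg, a.lens_mm);
}
```

Run that mapping several hundred times a second, one voxel per laser pulse, and you have the makings of the display.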
Having a device that projects images with balls of plasma leads to another question: how safe is this thing? There’s no mention of how powerful the laser used in this device is, but in every picture of this projector, people are wearing goggles. In the videos – one is available below – there is something that is obviously missing once you notice it: sound. This projector is creating tiny balls of expanding air hundreds of times per second. We don’t know what it sounds like – or if you can hear it at all – but a constant buzz would limit its application as an advertising medium.
As with any state-of-the-art project where we kinda know how it works, there’s a good chance someone with experience in optics could put something like this together. A normal green laser pointer in a water medium would be much safer than an IR YAG laser, but other than that the door is wide open for anyone wanting to replicate this project.