On 3D Scanners And Giving Kinects A New Purpose In Life

The concept of a 3D scanner seems simple enough in theory: point a camera at the physical object you wish to scan, move around it to capture all angles, and stitch the images together into a 3D model, with textures created from the same photos. This photogrammetry approach is definitely viable, but also limited, in the sense that you're inferring three-dimensional geometry from a set of 2D images and are at the mercy of suitable lighting.

To get more detailed depth information from a scene you need to perform direct measurements, which can be done physically or through e.g. time-of-flight (ToF) measurements. Since contact-free measurement tends to be preferred, ToF makes a lot of sense, but it comes with the disadvantage of measuring only a single spot at a time. When the target is actively moving, you can fall back on photogrammetry or use an approach called structured-light (SL) scanning.
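
As a back-of-the-envelope illustration (not tied to any particular scanner), the ToF principle boils down to halving the round-trip time of a light pulse; a quick sketch:

```python
# Sketch of the time-of-flight principle: distance is half the measured
# round-trip time multiplied by the speed of light. The 10 ns round trip
# below is just an illustrative value, not a real sensor reading.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target for a measured round-trip time."""
    return C * round_trip_s / 2.0

print(f"{tof_distance(10e-9):.3f} m")  # ~1.499 m
```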

SL is what consumer electronics like the Microsoft Kinect popularized, using the combination of a visible-light and near-infrared (NIR) camera to record a pattern projected onto the subject, similar to how face-based login systems like Apple's Face ID work. Considering how often Kinects have been pressed into service as general-purpose 3D scanners, this raises many questions about today's crop of consumer 3D scanners, such as whether they're all basically just Kinect clones.
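
Depth recovery from the recorded pattern is essentially stereo triangulation, with the projector standing in for a second camera. A minimal sketch, where the focal length and baseline are ballpark, Kinect-like assumptions for illustration only:

```python
def sl_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified projector/camera pair: z = f * b / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 580 px focal length, 7.5 cm projector-camera baseline,
# and a 30 px shift of the projected pattern at this pixel.
print(f"{sl_depth(580.0, 0.075, 30.0):.2f} m")  # ~1.45 m
```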

Continue reading “On 3D Scanners And Giving Kinects A New Purpose In Life”

The Internet We Didn’t Get

Collective human consciousness is full of imagined or mythical dream-like utopias: hidden behind mountains, across or under oceans, shrouded in mist, or deep in the jungle. From Atlantis and Avalon to El Dorado and Shangri-La, we have never stopped imagining these secret, fantastical places. One of them, Xanadu, is actually a real place, but it has been embellished over the years into a place of legend and myth, and thus became the namesake of an Internet we never got to see, just like all of those other mystical, hidden places.

The Xanadu project got its start in the 1960s, at around the same time as the mouse and what we might recognize as a modern computer user interface. At its core was hypertext, with the ability to link not just pages but references and files together into one network. It also had version control, rights management, bi-directional links, and a number of additional features that would be revolutionary even today. Another core feature was transclusion, in which one document is embedded in another by reference rather than by copying, which in Xanadu also served to make sure that original authors were compensated when their work was reused. However, Xanadu was hampered by a number of issues, including a lack of funding, infighting among the project's contributors, and the development of an almost cult-like devotion to the vision, not unlike some of today's hype around generative AI. Surprisingly, despite these faults, the project received significant funding from Autodesk, but even with this support it ultimately failed.

Instead of this robust, bi-directional web imagined as early as the 1960s, what we got is the much simpler World Wide Web, in which only a few of Xanadu's features are recognizable. Not only was it less complex to implement, it famously received institutional backing from CERN almost immediately rather than stagnating for decades. The article linked above contains a tremendous amount of detail about this story that's worth checking out. For all its faults and lack of success, though, Xanadu is an interesting image of what the future of the past could have been like if just a few things had shaken out differently; instead, it will remain a mythical place like so many others.

3D Print Smoothing, With Lasers

As anyone who has used an FDM printer can tell you, it's certainly not the magical replicator it's often made out to be. The platform's limitations are numerous, ranging from anisotropic material characteristics to visual imperfections in the parts. In an attempt to reduce those visual artifacts, [TenTech] affixed a small diode laser to a 3D printer.

Getting the 1.5 watt diode laser onto the printer was a simple matter of a bracket and wiring it to the control board as if it were a fan. Tuning the actual application of the laser proved a little more challenging: while the layer lines did get smoothed, the laser also discolored the pink filament, making the results somewhat unusable. Darker filaments don't seem to have this issue, so a dark blue is used for the rest of the video.
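
The video doesn't dwell on the exact G-code, but wiring the laser in as a fan suggests it gets switched with ordinary fan commands; a hypothetical helper, assuming a Marlin/Klipper-style M106/M107:

```python
# Hypothetical: with the laser wired to a fan output, standard fan G-code
# presumably sets its power, scaled 0-255 as Marlin/Klipper firmware expects.

def laser_power_gcode(power: float) -> str:
    """Fan command that would set laser power, with power in 0.0..1.0."""
    if not 0.0 <= power <= 1.0:
        raise ValueError("power must be between 0 and 1")
    return f"M106 S{round(power * 255)}"

print(laser_power_gcode(0.6))  # M106 S153
print("M107")                  # fan output (and thus laser) off
```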

A half-smoothed, half-unprocessed test print

The smoothing process begins at the end of a 3D print and uses non-planar printer movements to keep the laser at an ideal focusing distance. The results proved rather effective, giving a noticeably smoother and shinier finish than an unprocessed print. The smoothing works incredibly well on fine geometry that would be difficult or impossible to smooth out by traditional mechanical means. Some detail was lost, with sharp corners getting rounded, but not nearly as much as [TenTech] feared.
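
Conceptually, the pass looks something like the sketch below: trace the part's outer surface while holding the laser at a fixed focal offset. The surface function and focal distance here are placeholders, not [TenTech]'s actual numbers.

```python
FOCAL_OFFSET_MM = 20.0  # assumed focal distance of the laser, not measured

def surface_z(x: float, y: float) -> float:
    """Placeholder for the printed part's top-surface height at (x, y)."""
    return 10.0 + 0.02 * x  # e.g. a gently sloped surface

def smoothing_pass(xs, y: float, feed: int = 600):
    """Yield non-planar G1 moves that keep the laser in focus."""
    for x in xs:
        z = surface_z(x, y) + FOCAL_OFFSET_MM
        yield f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f} F{feed}"

for line in smoothing_pass(range(0, 30, 10), y=50.0):
    print(line)
```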

For a final test, [TenTech] made two candle molds, one laser-smoothed and one unprocessed. The quality difference between the two resulting candles was minimal, with the one from the smoothed mold perhaps even a little worse. However, a large amount of wax leaked into the infill of the unprocessed mold, while the smoothed mold showed no signs of leaking.

If you are looking for a somewhat safer 3D print post-processing technique, make sure to check out [Donal Papp]'s UV resin smoothing experiments!

Continue reading “3D Print Smoothing, With Lasers”

A photo of the front-panel with a bunch of lamps and knobs.

The Making Of A Minimalist Analog Drum Machine

Our hacker [Moritz Klein] shows us how to make a minimalist analog drum machine. If you want the gory details, check out the video embedded below; there is a first-class write-up available as a 78-page PDF manual, too. Indeed, it has been a while since we have seen a project this well documented.

A typical drum machine will have many buttons and LEDs and is usually implemented with a microcontroller. In this project [Moritz] eschews that complexity and comes up with an analog solution using a few integrated circuits, LEDs, and buttons.

At the heart of the build are the integrated circuits, which include two TL074 quad op-amps, a TL072 dual op-amp, a CD4520 binary counter, and eight CD4015 shift registers. Fifteen switches and buttons are used, along with seven LEDs. And speaking of LEDs, our hacker [Moritz] seems to have an LED schematic symbol tattooed on his hand, and we don't know about you, but this screams credibility to us! :)
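
We haven't traced the schematic line by line, but conceptually the digital half is a recirculating shift-register sequencer: each clock pulse advances the pattern one step, and a high bit on a tapped output fires a drum voice. A toy model of that idea (the pattern is hypothetical):

```python
from itertools import islice

def sequencer(pattern):
    """Recirculating shift register: yield one gate bit per clock, forever."""
    bits = list(pattern)
    while True:
        yield bits[0]                 # output for the current step
        bits = bits[1:] + bits[:1]    # last stage feeds the data input again

kick = sequencer([1, 0, 0, 0, 1, 0, 0, 0])  # hypothetical 8-step kick pattern
for step, gate in enumerate(islice(kick, 8), start=1):
    print(f"step {step}: {'KICK' if gate else '.'}")
```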

Continue reading “The Making Of A Minimalist Analog Drum Machine”

Mesa Project Adds Code Comprehension Requirement After AI Slop Incident

Recently [Faith Ekstrand] announced on Mastodon that Mesa was updating its contributor guide. This follows a recent AI slop incident in which someone submitted a massive patch to the Mesa project, claiming it would improve performance 'by a few percent'. The catch? The entire patch was generated by ChatGPT, and the submitter became somewhat irate when the very patient Mesa developers tried to explain that they'd happily look at the issue after the submitter had condensed the purported 'improvement' into a bite-sized patch.

The entire saga is summarized in a recent video by [Brodie Robertson], which highlights both how incredibly friendly the Mesa developers are and how the use of ChatGPT and kin has made some people with zero programming skills apparently believe that they can now contribute code to OSS projects. Unsurprisingly, the Mesa developers were unable to disabuse this particular individual of that notion, but the diff to the Mesa contributor guide by [Timur Kristóf] should make abundantly clear that someone playing Telephone between a chatbot and OSS project developers is neither desirable nor helpful.

That said, [Brodie] also highlights a recent post by [Daniel Stenberg] of Curl fame, who thanked [Joshua Rogers] for contributing a massive list of potential issues that were found using ‘AI-assisted tools’, as detailed in this blog post by [Joshua]. An important point here is that these ‘AI tools’ are not LLM-based chatbots, but rather tweaked existing tools like static code analyzers with more smarts bolted on. They’re purpose-made tools that still require you to know what you’re doing, but they can be a real asset to a developer, and a heck of a lot more useful to a project like Curl than getting sent fake bug reports by a confabulating chatbot as has happened previously.

Continue reading “Mesa Project Adds Code Comprehension Requirement After AI Slop Incident”

Electric Surfboard Gets Thrust Vectoring Upgrade

The internet has already taught us that an electric surfboard is a great way to get around on the water while looking like an absolute badass. [RCLifeOn] is continuing to push the boat forward in this regard, however, adding thrust vectoring technology to his already-impressive build.

If you’re unfamiliar with the world of electric surfboards, the concept is relatively simple. Stick one or more electric ducted fan thrusters on the back, add some speed controllers, and power everything from a chunky bank of lithium-ion batteries. Throw in a wireless hand controller, and you’ve got one heck of a personal watercraft.

Traditionally, these craft are steered simply by leaning and twisting as a surfer would with a traditional board. However, more dynamic control is possible if you add a way to aim the thrust coming from the propulsion system. [RCLifeOn] achieved this by adding steerable nozzles behind the ducted fan thrusters, controlled with big hobby servos to handle the forces involved. The result is a more controllable electric surfboard that can seriously carve through the turns. Plus, it’s now effectively an RC boat all on its own, as it no longer needs a rider on board to steer.
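
As a rough idea of what the control mixing might look like, here's a sketch; the names, the deflection limit, and the decision to gang both nozzles together are our assumptions, not [RCLifeOn]'s actual firmware:

```python
NOZZLE_MAX_DEG = 25.0  # assumed mechanical deflection limit of the nozzles

def mix(throttle: float, steer: float) -> tuple[float, float, float]:
    """Map throttle (0..1) and steering (-1..1) to thrust and nozzle angles."""
    steer = max(-1.0, min(1.0, steer))
    angle = steer * NOZZLE_MAX_DEG
    return throttle, angle, angle  # both nozzles deflect in unison

print(mix(0.8, -0.5))  # (0.8, -12.5, -12.5): strong throttle, carving left
```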

We’ve covered various developments in this surfboard’s history before, too. Video after the break. Continue reading “Electric Surfboard Gets Thrust Vectoring Upgrade”

Segger’s Awkward USB-C Issue With The J-Link Compact Debugger

Theoretically, USB-C is a pretty nifty connector, but the reality is that it mostly provides many exciting new ways to make your device not work as expected. With the gory details covered by [Alvaro], the latest to join the party is Segger, whose J-Link BASE Compact MCU debugger displays the same behavior we saw back when the Raspberry Pi 4 was released in 2019. Back then, so-called e-marked USB-C cables failed to power the SBC, much like how this particular J-Link unit refuses to power up when connected using one of those special USB-C cables.

We covered the issue in great detail back then, discussing how the CC1 and CC2 connections need to be wired up correctly, each with its own appropriately-sized resistor, in order for the USB-C supply (like a host PC) to provide power to the device. As [Alvaro] discovered through some investigation, this unit makes basically the same mistake as the RPi 4B did before its corrected design: CC1 and CC2 are wired together, so with an e-marked cable the host sees the same <1 kOhm resistance on both CC lines, which tells the host you just hooked up something like a USB-C audio dongle, and that obviously shouldn't be supplied with power.
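
To make the failure mode concrete, here's a simplified model of how a source classifies the terminations it sees on CC1/CC2; the resistance windows are rounded from the Type-C spec, and this is an illustration rather than [Alvaro]'s measurement setup:

```python
import math

RD = 5100.0    # sink pull-down, ohms
RA = 1000.0    # e-marker/VCONN load, ohms (spec range roughly 0.8-1.2 k)
OPEN = math.inf

def kind(r: float) -> str:
    """Crudely bucket a CC resistance into Ra, Rd, or open."""
    if r <= 1200.0:
        return "Ra"
    if r <= 5600.0:
        return "Rd"
    return "open"

def classify(cc1: float, cc2: float) -> str:
    k = {kind(cc1), kind(cc2)}
    if k == {"Rd", "open"}:
        return "sink attached: provide power"
    if k == {"Rd", "Ra"}:
        return "sink plus e-marked cable: provide power and VCONN"
    if k == {"Ra"}:
        return "audio adapter accessory: no power"
    return "unrecognized"

print(classify(RD, OPEN))  # correct design, plain cable: power is provided
# CC1/CC2 tied to a single Rd, plus an e-marked cable's Ra in parallel:
r = 1 / (1 / RD + 1 / RA)  # ~836 ohms, seen on both lines
print(round(r), classify(r, r))  # reads as an audio dongle -> no power
```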

Although it’s not easy to tell when this particular J-Link device was produced, the PCB marks its revision as v12.1, so presumably this isn’t the first rodeo for this general design, and the product page already shows a different label than the one on the device [Alvaro] has. It’s possible that it was originally a sloppy conversion of a previous micro-USB-powered design, where CC lines do not exist and things Just Work™, but it’s still a pretty major oversight from what should be a reputable brand selling a device that costs €400 + VAT, rather than a reputable brand selling a <$100 SBC.

For any in the audience who have one of these USB-C-powered debuggers, does yours work with e-marked cables, and what is the revision and/or purchase date?