Selective laser sintering of metal is cool but slow. Fear not, a technology that was initially developed to smooth and pattern laser beams is here to save the day, according to a new paper by Lawrence Livermore researchers.
In a paper titled “Diode-based additive manufacturing of metals using an optically-addressable light valve,” the researchers lay out a procedure for using an array of high-powered laser diodes, among other things, to print a whole layer of metal from metal powder at one time. No more forward and backward, left and right. Just one bright flash and you’re done. Naturally, the technology is still in its infancy, but huge 3D printed metallic parts are something we’ve always hoped for.
According to [Matthews], the first author of the paper, the mojo of the process comes from a customized laser modulator: the Optically Addressable Light Valve which functions similarly to liquid crystal-based projectors but can handle the high energies associated with powerful lasers. There’s more information straight from the paper’s authors in this phys.org interview.
While direct metal 3D printing is clearly having its moment, it appears that for the time being the average hacker is stuck with alternative methods of printing metal. While it’s not the same, pewter casting with PLA might suffice.
Thanks to [Kevin] for sending this in!
The interesting thing about submissions for The Hackaday Prize is seeing unusual projects and concepts that might not otherwise pop up. [ken conrad] has a curious but thoughtfully designed idea for Raspberry Pi-based SmartZoom Imaging that uses a Pi Zero and camera plus some laser emitters to create a device with a very specific capability: a camera that constantly and dynamically resizes the image to make the subject appear consistently framed and sized, regardless of its distance from the lens. The idea brings together two separate functions: rangefinding and automated zooming and re-sampling of the camera image.
The Raspberry Pi uses the camera board plus some forward-pointing laser dots as a rangefinder; as long as at least two laser dots are visible on the subject, the distance between the device and the subject can be calculated. The Pi then uses the knowledge of how near or far the subject is to present a final image whose zoom level has been adjusted to match (and offset) the range of the subject from the camera, in effect canceling out the way an object appears larger or smaller based on distance.
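The math behind that cancellation is straightforward pinhole-camera parallax. Here is a minimal sketch, assuming two laser emitters mounted parallel at a known baseline; the function names, the 10 cm baseline, and the focal length in pixels are all illustrative, not taken from [ken conrad]’s project:

```python
def estimate_distance(pixel_separation, baseline_m=0.10, focal_px=1000.0):
    """Pinhole-camera parallax: two parallel laser dots a fixed baseline
    apart appear closer together in the image as the subject recedes."""
    if pixel_separation <= 0:
        raise ValueError("laser dots not detected")
    return focal_px * baseline_m / pixel_separation

def zoom_factor(distance_m, reference_m=1.0):
    """Digital zoom needed to make the subject appear the same size it
    would at the reference distance, cancelling perspective shrinkage."""
    return distance_m / reference_m

# Dots 100 px apart, 10 cm baseline, 1000 px focal length -> 1 m away
d = estimate_distance(100)   # 1.0 m
z = zoom_factor(d)           # 1.0, no zoom needed at the reference distance
```

Halve the apparent dot separation and the estimated distance (and hence the zoom factor) doubles, which is exactly the effect the device is offsetting.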
We’ve seen visible laser dots as the basis of rangefinding before, but never tied into a zoom function. Doubtless, [ken conrad] will update his project with some example applications, but in the meantime we’re left wondering: is there a concrete, practical use case for this unusual device? We have no idea, but we’d certainly have fun trying to find one.
We’re suckers for any project that’s nicely packaged, but an added bonus is when most of the components can be sourced cheaply and locally. Such is the case for this little laser light show, housed in electrical boxes from the local home center and built with stuff you probably have in your junk bin.
When we first came across [replayreb]’s write-up and saw that he used hard drives in its construction, we assumed he used head galvanometers to drive the mirrors. As it turns out, he used that approach in an earlier project, but this time around, the hard drive only donated its platters for use as low-mass, first-surface mirrors. And rather than driving the mirrors with galvos, he chose plain old brushed DC motors. These have the significant advantage of being cheap and a perfect fit for 3/4″ EMT set-screw connectors, designed to connect thin-wall conduit, also known as electrical metallic tubing, to electrical boxes and panels. The motors are mounted to the back and side of the box so their axes are 90° from each other, and the mirrors are constrained by small cable ties and set at 45°. The motors are driven directly by the left and right channels of a small audio amp, wiggling enough to create a decent light show from the laser module.
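Since each mirror follows one audio channel, the classic test signal is a Lissajous figure: sine waves at different frequencies on the left and right channels trace a looping figure on the wall. A minimal sketch of synthesizing such a stereo signal (the function name and the 2:3 frequency ratio are our own choices, not from the write-up):

```python
import math

def lissajous_samples(freq_x=440.0, freq_y=660.0, rate=44100, seconds=1.0):
    """Stereo sample pairs in [-1, 1]: the left channel drives the X
    mirror, the right drives Y. A 2:3 ratio traces a classic Lissajous."""
    n = int(rate * seconds)
    return [(math.sin(2 * math.pi * freq_x * t / rate),
             math.sin(2 * math.pi * freq_y * t / rate)) for t in range(n)]

# A short burst; write these pairs to a stereo WAV and play it into the amp
samples = lissajous_samples(seconds=0.01)
```

Sweeping one frequency slowly past the other makes the figure tumble, which is most of the fun of a two-mirror show.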
We especially like the fact that these boxes are cheap enough that you can build three with different color lasers. In that case, an obvious next step would be bandpass filters to split the signal into bass, midrange, and treble for that retro-modern light organ effect. Or maybe figuring out what audio signals you’d need to make this box into a laser sky display would be a good idea too.
Continue reading “Little Laser Light Show is Cleverly Packaged, Cheap to Build”
Self-driving cars are, apparently, the next big thing. This thought is predicated on advancements in machine vision and cheaper, better sensors. For the machine vision part of the equation, Nvidia, Intel, and Google are putting out some interesting bits of hardware. The sensors, though? We’re going to need LIDAR, better distance sensors, more capable CAN bus dongles, and the equipment to tie it all together.
This is the cheapest LIDAR we’ve ever seen. The RPLIDAR is a new product from Seeed Studios, and it’s an affordable LIDAR for everyone. $400 USD gets you one module, and bizarrely $358 USD gets you two modules. Don’t ask questions — this price point was unheard of a mere five years ago.
Basically, this LIDAR unit is a spinning module connected to a motor via a belt. A laser range finder is hidden in the spinny bits and connected to a UART and USB interface through a slip ring. Mount this LIDAR unit on a robot, apply power, and the spinny bit does its thing at about 400-500 RPM. The data that comes out includes distance (in millimeters), bearing (in units of degrees), quality of the measurement, and a start flag once every time the head makes a revolution. If you’ve never converted polar to cartesian coordinates, this is a great place to start.
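That conversion is a one-liner per sample. A minimal sketch, assuming the usual convention of 0° along +x with angles increasing counterclockwise (the real module’s zero direction and rotation sense may differ, so check the datasheet):

```python
import math

def polar_to_cartesian(distance_mm, bearing_deg):
    """Convert one LIDAR sample (millimeters, degrees) to x/y millimeters,
    with 0 degrees along +x and angles increasing counterclockwise."""
    theta = math.radians(bearing_deg)
    return (distance_mm * math.cos(theta), distance_mm * math.sin(theta))

# One full revolution of samples becomes a 2D point cloud around the robot
scan = [(1000, 0.0), (1000, 90.0), (500, 180.0)]
points = [polar_to_cartesian(d, b) for d, b in scan]
```

Collect the samples between two start flags and you have one complete 360° slice of the room.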
Although self-driving cars and selfie drones are the future, this part is probably unsuitable for any project with sufficient mass or velocity. The scanning range of this LIDAR is only about 6 meters and insufficient for retrofitting a Toyota Camry with artificial intelligence. That said, this is a cheap LIDAR that opens the door to a lot of experimentation ranging from small robots to recreating that one Radiohead video.
Most of us have Ethernet in our homes today. The real backbones of the Internet, though, use no wires at all. Optical fibers carry pulses of light across the land, under the sea, and if you’re lucky, right to your door. [Sven Brauch] decided to create an optical link. He didn’t have any fiber handy, but air will carry laser pulses over short distances quite nicely. The idea of this project is to directly convert Ethernet signals to light pulses. For simplicity’s sake, [Sven] limited the bandwidth to one channel, full-duplex, at 10 Megabits per second (Mbps).
The transmit side of the circuit is rather simple. An op-amp circuit acts as a constant current source, biasing the laser diode. The transmit signal from an Ethernet cable is then added in as modulation. This ensures the laser glows brightly for a 1 bit but never shuts completely off for a 0 bit.
The receive side of the circuit starts with a photodiode. The diode is biased up around 35 V, and a transimpedance amplifier (a current to voltage converter) is used to determine if the diode is seeing a 1 or a 0 from the laser. A bit more signal conditioning ensures the output will be a proper differential Ethernet signal.
[Sven] built two identical boards – each with a transmitter and receiver. He tested the circuit by pointing it at a mirror. His Linux box immediately established a link and reported a duplicate IP address on the network. This was exactly what [Sven] expected. The computer was confused by its own reflection – but the laser and photodiode circuits were working.
Finally, [Sven] connected his PC and a Raspberry Pi to the two circuits. After carefully aligning the lasers on a wooden board, the two machines established a link. Success! (But be aware that at longer distances, more sophisticated alignment mechanisms may be in order.)
Want to know more about fiber and networking? Check out this article about wiring up an older city. You can also use an optical link to control your CNC.
Photochromic paint is pretty nifty – under exposure to light of the right wavelength, it’ll change colour. This gives it all kinds of applications for temporary displays. [Jiri Zemanek] decided to apply photochromic paint to an egg, utilising it to create stroboscopic patterns with the help of a laser.
Patterns for the egg are generated in MATLAB. A Discovery STM32 board acts as a controller, looking after the laser scanner and a stepper motor which rotates the egg. A phototransistor is used to sync the position of the laser and the egg as it rotates.
The photochromic paint used in this project is activated by UV light. To energize the paint, [Jiri] harvested a violet laser from a Blu-ray player, fitting it to a scanning assembly from a laser printer. Instead of scanning the laser across an imaging drum, it is instead scanned vertically on a rotating egg. Patterns can then be drawn on the egg, which fade over time as the paint gives up its stored energy.
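With the laser sweeping a fixed vertical line while the stepper spins the egg, the drawing problem reduces to firing the right pixel column at the right time after each sync pulse. A minimal sketch of that timing math, assuming constant rotation speed; all names and numbers are illustrative, not taken from [Jiri]’s firmware:

```python
def column_fire_time(column, total_columns, rpm, sync_time_s):
    """Time at which the laser should draw a given pixel column, measured
    from the phototransistor sync pulse that marks a full egg revolution."""
    period_s = 60.0 / rpm                       # seconds per revolution
    return sync_time_s + (column / total_columns) * period_s

# With a 256-column pattern at 60 RPM, column 64 fires a quarter turn
# (0.25 s) after the sync pulse
t = column_fire_time(64, 256, 60.0, 0.0)
```

Re-deriving the period from each sync pulse, rather than trusting the commanded stepper rate, is what keeps the pattern from slowly drifting around the egg.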
[Jiri] exploits this by writing a variety of patterns onto the egg, which then animate in a manner similar to a zoetrope – when visualised under strobing light, the patterns appear to move. There are also a few holiday messages shown for Easter, making the egg all the more appropriate as a billboard.
If you like the idea of drawing on eggs but are put off by their non-uniform geometry, check out the Egg-bot. Video below the break.
Continue reading “Photochromic Eggs: Not for Breakfast”
How do you like your Ham and Cheese sandwich? If you answered “I prefer it beefy”, look no further than [William Osman]’s Vin Diesel Ham and Cheese Sandwich! [Osman]’s blog tagline is “There’s science to do” but he is the first to admit this is science gone too far. When one of his followers, [Restroom Sounds], commented “Please sculpt a bust of [Vin Diesel] using laser cut cross-sections of laser sliced ham”, he just had to do it.
His friend [CameraManJohn] modeled the bust using Maya and [Osman] has provided links to download the files in case there’s the remote possibility that someone else wants to try this out. They picked the cheapest packs of sliced ham they could get from the supermarket — so technically, they did not actually laser slice the ham. For help with generating the slice outlines, they found the Slicer app for Autodesk’s Fusion 360 which did exactly what needed to be done. The app converts the 3D model into individual cross sections, similar to an MRI. It helps to measure the thickness of various samples of your raw material so that the Slicer output is not too stretched (or squished). The result is a set of numbered 2D drawings that can be sent to your laser cutter.
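The thickness measurement matters because the slicer divides the model height into a whole number of layers, and a mismeasured slice makes every layer’s error accumulate. A quick sanity check of that arithmetic (the function and the numbers are made up for illustration, not from the project):

```python
def slice_plan(model_height_mm, slice_thickness_mm):
    """Number of ham slices needed for a model, plus the leftover height
    error when the model is not an exact multiple of the slice thickness."""
    count = round(model_height_mm / slice_thickness_mm)
    error_mm = model_height_mm - count * slice_thickness_mm
    return count, error_mm

# A 150 mm bust built from 1.5 mm ham slices needs 100 slices, no error
count, err = slice_plan(150.0, 1.5)
```

If the error is large, it is usually better to rescale the model slightly than to accept a visibly squished (or stretched) [Vin Diesel].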
The rest of the video scores pretty high on the gross-o-meter, as [Osman] goes about laser cutting slices of ham (and a few slices of cheese), tasting laser cut ham (for Science, of course), and trying to prevent his computer from getting messed up. In the end, the sandwich actually turns out looking quite nice, although we will not comment on its taste. A pair of googly eyes adds character to the bust.
One problem is that the Slicer app does not optimise its results for efficient packing, with the smallest part occupying the same bounding box as the largest. This leads to a lot of wasted ham being thrown away. [Bill] is still wondering what to do with his awesome sandwich, so if you have suggestions, chime in with your comments after you’ve seen the video linked below. If you know [Vin Diesel], let him know.
This isn’t [Osman]’s first adventure with crazy food hacks — here are a few tasty examples: a Toast-Bot that Butters For You (sometimes), a Laser-Cut Gingerbread Trailer Home, and a Pumpkin-Skinned BattleBot.
Continue reading “Sudo Make Me a Sandwich”