The outer shell was built around a 3D printed heart-shaped form, upon which lengths of brass wire were soldered together into an attractive heart-shaped cage. Inside, an Arduino Nano drives a series of WS2812B LEDs, flashing them in time with the heartbeat of whoever is holding the heart, thanks to a MAX30102 heart-rate sensor. There’s also a TP4056 charge module and a small lithium battery to provide power for the device.
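The firmware itself isn’t published here, but the core logic — estimate the pulse rate from beat timestamps, then drive the LEDs with a matching flash envelope — can be sketched in a few lines. This is a minimal Python illustration under our own assumptions, not the project’s actual Arduino code, and all function names are hypothetical:

```python
import math

def bpm_from_beats(beat_times):
    """Estimate beats per minute from a list of beat timestamps (seconds),
    as a MAX30102-style sensor would report them."""
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def pulse_brightness(t, bpm, peak=255):
    """LED brightness at time t: a bright flash at each beat that decays
    exponentially until the next one, mimicking a heartbeat."""
    period = 60.0 / bpm            # seconds per beat
    phase = (t % period) / period  # 0..1 position within the current beat
    return int(peak * math.exp(-5.0 * phase))

# Beats every 0.8 s -> 75 BPM; brightness peaks right on the beat
beats = [0.0, 0.8, 1.6, 2.4]
print(round(bpm_from_beats(beats)))  # 75
print(pulse_brightness(0.0, 75))     # 255
```

On the real hardware the brightness value would be written to the WS2812B chain each loop iteration; here it just demonstrates the beat-locked decay.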
Adding the heartbeat sensor really makes this project shine, forming a connection between the holder and the device itself. The tasteful craftsmanship of the brass design makes this an excellent gift, one we’re sure anyone would like to receive. We’ve seen [Jiří Praus] make the most of this artform before too, with projects like this stunning tulip or dead-bug Arduino. Video after the break.
Drop what you’re doing and get thee to thy workshop. This is the last weekend of the Hackaday Circuit Sculpture Contest, the perfect chance for you to exercise the creative hacker within by building something artistic using stuff you already have on hand.
The concept is simple: build a sculpture where the electronic circuit is the sculpture. Wire the components up in a way that shows off that wiring, and uses it as the structure of the art piece. Seven top finishers will win prizes, but really we want to see everyone give this a try because the results are so cool! Need proof? Check out all the entries, then ooh and ah over a few we’ve picked out below. You have until this Tuesday at noon Pacific time to get in the game.
wirez80 by Matseng
555 Spider by Sunny
Freeform RGB Atari Punk Console by Emily Velasco
These are just three awesome examples of the different styles we’ve seen so far in the contest. Who needs a circuit board for a retro computer? Most people… but apparently not [Matseng], as this Z80 computer is freeformed yet still interactive.
Really there can’t be many things more horrifying than the thought of spider robots, but somehow [Sunny] has taken away all of our fears. The 555 spider project takes “dead bug” to a whole new level. We love the angles in the legs, and the four SMD LEDs as spider eyes really finish the look of the tiny beast.
[Huaishu Peng] and a group of other researchers have come up with a system that allows them to use virtual reality (VR) to model an object in a space in front of them while a robot simultaneously 3D prints that object in that same space, a truly collaborative effort they call the RoMA: Robotic Modelling Assistant. This is a step toward fixing the problem of designing something and then having to wait for the prototype to be made before knowing how well it fits the design goals.
How does the designer/robot collaboration work? The designer wears an Oculus Rift VR headset with a camera mounted to the front, turning it into an AR (Augmented Reality) headset. In front of the designer is a rotating platform on which the object will be 3D printed. And on the other side of the platform is the 3D printing robot. In the AR headset, the designer views the platform, the object, and the robot as seen by the camera, but with the model he’s working on overlaid onto the object. An AR hand controller allows him to work on the model. Meanwhile, the robot 3D prints the model. See it in action in the video below.
3D printing is supposed to be about rapid prototyping. Design, print, use, re-design, print, test — iterate until happy. But when you’re laying down filament at 60 mm/s, it can seem anything but rapid.
[Huaishu Peng], [Rundong Wu], and their supervisors at Cornell have come up with a 3D printer that can print almost as fast as you can model, and is able to add and subtract from the model on the fly. The goal is to get an initial model out so quickly that designing and printing can be truly interactive. They look to have succeeded — check out the video below.
3ders.org has a brilliant writeup of the machine that you should also go read once the video’s magic has worn off. There’s a lot going on to make this all work. The printer adds two extra degrees of freedom and a cutter head so that it can make additions and subtractions from the side, and is not constrained to layer-by-layer construction. To get the ABS to cool fast enough to make solid strands, water jets mist it down to temperature just after it’s printed.
3D printers may be old news to most of us, but that’s not stopping creative individuals from finding new ways to improve on the technology. Your average consumer budget 3D printer uses an extrusion technology, whereby plastic is melted and extruded onto a platform. The printer draws a single two-dimensional cross-section of the print at a time, then moves up layer by layer. It’s an effective and inexpensive method for turning a computer design into a physical object. Unfortunately, it’s also very slow.
That’s why Hasso Plattner Institute and Cornell University teamed up to develop WirePrint. WirePrint can slice your three-dimensional model into a wire frame version that is capable of being printed on an extrusion printer. You won’t end up with a strong final product, but WirePrint will help you get a feel for the overall size and shape of your print. The best part is it will do it in a fraction of the time it would take to print the actual object.
This is a similar idea to reducing the amount of fill that your print has, only WirePrint takes it a step further. The software tells your printer to extrude plastic in vertical lines, then pauses for just enough time for it to cool and harden in that vertical position. The result is much cleaner than if this same wire frame model were printed layer by layer. It also requires less overall movement of the print head and is therefore faster.
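To make the pause-and-harden idea concrete, here’s a toy G-code generator for a single vertical strut: travel to the base, extrude straight upward, then dwell so the strand stiffens before the head moves on. This is our own illustrative sketch, not WirePrint’s actual output, and the feed rate and dwell time are invented values:

```python
def strut_gcode(x, y, z_top, dwell_s=1.0, feed=300):
    """G-code for one vertical wireframe strut: move to the base, extrude
    straight up, then dwell (G4) so the filament cools in place."""
    return [
        f"G0 X{x} Y{y} Z0",                             # travel to strut base
        f"G1 X{x} Y{y} Z{z_top} E{z_top:.2f} F{feed}",  # extrude upward
        f"G4 P{int(dwell_s * 1000)}",                   # pause (ms) to harden
    ]

for line in strut_gcode(10, 20, 30):
    print(line)
```

A full wireframe print would emit one of these strut sequences per edge of the sliced mesh, which is where the time savings over dense layer-by-layer infill come from.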
[Francois] over at 1024 Architecture has been working on a project we think you’re likely to see in a professional music video before too long. Using his Kinect sensor, he has been tracking skeletal movements, adding special effects to the resulting wire frame with Quartz Composer. While this idea isn’t new, the next part is. He takes the QC tweaked video stream and then projects it back over the performer using MadMapper to match the video to the body movements, recording the resultant display.
The project started out with a few hiccups, including a noticeable delay between the body tracking and the display. It caused the performer to have to move more slowly than he would like, so things had to be tweaked. [Francois] first tested the latency between his computer and the projector by displaying a timecode representation on the screen as well as via the projector. He found the projector to have a latency of 1 frame at 60 fps, which wasn’t too bad. This led him to believe the culprit was his Kinect, and he was right. There was a 6 frame delay, so he locked the video output to 30 fps in hopes of cutting that delay in half.
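The debugging here boils down to frame-time arithmetic: a delay counted in frames only becomes meaningful once you multiply by the frame period. A quick sanity check (our own illustration):

```python
def delay_ms(frames, fps):
    """Convert a delay measured in frames into milliseconds."""
    return frames * 1000.0 / fps

print(delay_ms(1, 60))   # projector: 1 frame at 60 fps -> ~16.7 ms
print(delay_ms(6, 60))   # Kinect: 6 frames at 60 fps -> 100 ms
```

Worth noting: 6 frames at 60 fps and 3 frames at 30 fps both work out to 100 ms, so halving the frame count only reduces wall-clock latency if other parts of the pipeline tighten up as well.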
The effect is slightly reminiscent of Tron, but with more distortion. We can’t wait to see more projects similar to this one in the future.
The resulting video embedded below is pretty cool in our opinion, but you can judge for yourself.