We’re on the ground here at the 6th Annual World Maker Faire in Queens, New York. This year the Faire is even bigger, extending out from the New York Hall of Science towards Flushing Meadows Corona Park.
Just inside the gates, we ran into [Tommy Mintz], who was showing off his improved portable automated digital photo collage (IP-ADC). [Tommy] has connected a Raspberry Pi to its camera via a long extension cable. The camera resides high up on a monopod, giving it a bird’s eye view of the area. The Pi first captures a background image, then grabs shots of things that change within the scene. The resulting collages range from the hilarious to the surreal. The entire system is mounted on a cart and powered by batteries.
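The capture-a-background-then-spot-the-changes approach can be sketched with simple background subtraction. Here's a minimal, hypothetical version in plain Python; [Tommy]'s actual code isn't published, so the pixel representation and the threshold value here are our own assumptions for illustration:

```python
def update_collage(collage, background, frame, threshold=30):
    """Copy pixels that differ from the background into the collage.

    Images are nested lists of (r, g, b) tuples. The threshold is an
    assumption; the real IP-ADC's tuning isn't published.
    """
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            # Sum of absolute per-channel differences against the background
            diff = sum(abs(p - b) for p, b in zip(pixel, background[y][x]))
            if diff > threshold:
                collage[y][x] = pixel
    return collage

# A 3x3 grey "background" and a frame where one pixel changed
background = [[(100, 100, 100)] * 3 for _ in range(3)]
frame = [row[:] for row in background]
frame[1][1] = (255, 0, 0)           # something new entered the scene
collage = [row[:] for row in background]
update_collage(collage, background, frame)
print(collage[1][1])                 # the changed pixel lands in the collage
```

Run repeatedly against fresh camera frames, only the things that move end up pasted into the collage, which is exactly why the results skew surreal.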
[Tommy] hooked us up with the WiFi password, so we were able to download photos directly from the IP-ADC. Less technologically inclined folks would be able to grab physical prints from the on-board Epson photo printer.
We’ll be reporting what we see here at the faire, so drop us a tweet @hackaday if you want us to stop by!
The 5th annual Kansas City Maker Faire was as fun as ever, but it definitely felt different from previous years. There seemed to be an unofficial emphasis on crafts this year, and I mean this in the broadest sense of the word. There was more exposure for the event in the local media, and this attracted a wider variety of faire-goers. But the exposure also brought more corporate sponsorship. This wasn’t an exclusively bad thing, though. For instance, several people from Kansas City-based construction firm JE Dunn were guiding mini makers through a birdhouse build.
Many of this year’s booths were focused on a particular handicraft. A local music shop that makes custom brass and woodwind instruments had material from various stages of the building process on display. Several tables away, a man sat making chainmaille bags. At one booth, a girl was teaching people how to fold origami cranes. Several makers had various geek culture accessories for sale, like a shoulder bag made from a vintage Voltron sweatshirt. The guys from SeeMeCNC made the 12-hour drive with the Part Daddy, their 17-foot tall delta printer. They printed up a cool one-piece chair on Saturday, then made a child-sized version of it on Sunday.
The entire lower level of the venue was devoted to a series of exhibits related to the film and television industry. Collectively, they covered the entire production process from the casting call to the red carpet. Several local prop and costume makers were showing off their fantastic creations, including [Steven] of SKS Props. He started making video game props for fun a few years ago. These days, his work adorns the offices of some of those same game companies.
Of course, there was plenty to see and do outside, too. All the kids playing human foosball were having a blast. LARPers larped next to lowriders and food trucks, power wheels raced, and a good time was had by all.
Tiny chair printed on the Part Daddy by SeeMeCNC
This is clearly an exceptional R2 unit.
A husband and wife team make arty robots using antique cameras, test equipment, and tins.
A luthier explains his process to an onlooker.
Some of the well-detailed costumes on display.
Masks by SKS Props.
JE Dunn hammered the weekend away helping kids make birdhouses.
Nerdly housewares were plentiful this year.
A peek at the film and television production process.
There is always a great variety of things to see and experience at the Kansas City Maker Faire. This is the fifth year for the event, which is held at historic Union Station, a beautiful art deco train depot from a bygone era. With a multitude of booths and exhibits across two floors and a vast outdoor area, there is something for pretty much everyone. Oftentimes, the most interesting things are mobile, conversation-starting creations. When we saw [Dan] walking around with a giant wooden contraption on his arm, we knew we must find out more about it.
The impetus for [Dan]’s project was his desire to pick up a soda can using a mechanical grip. He now believes this to have been a lofty goal, given the weight of a full can of the stuff. This prosthetic hand is made from wooden finger segments that are connected by small, continuous hinges. Each of [Dan]’s gloved fingers curls around a metal ring to control that digit’s large wooden analog. On the inside of the hand, sections of paracord run underneath strategically placed eye bolts on each finger segment and are tied off at the fingertips. A second set of eye bolts on the back of the hand anchor the network of rubber bands that provide resistance. Although he made it look easy to open and close the hand, [Dan] said that it’s pretty heavy to lug around and somewhat strenuous to use. Next time, he’ll probably go with foam or 3D-printed pieces.
For a few years now I’ve been developing an interactive army of delta robots. This ongoing project is fueled by my desire to control many mechanical extremities like an extension of my body (I’m assuming I’m not the only one who fantasizes about robots here).
Since my army doesn’t have a practical application… other than producing pretty light patterns and making the user feel extremely cool for a minute, I guess you’d call it art. In the past I ran a Kickstarter to fund the production of my art, which I can now happily show at cool events with interesting people; Maker Faire being one of them.
Interactivity and Sprawling Crowds
Last year, for our debut at the big Bay Area Maker Faire, my collaborator, [Mark], and I displayed a smaller sampling of 30 robots for our installation. We also decided to create an interactive aspect for others to experience. After the end of our crowdfunding period last March, we had a little over a month to do any development before the big event, so our options were slim. The easy solution was to jam our delta code into the hand-tracking demo that comes with the Xbox Kinect’s OpenNI library in Processing. This was cool enough to exhibit, but we hadn’t really anticipated how it would go over in an environment as densely packed as the dark room at Maker Faire.
We should have known better. Both of us were aware that there would be many, many children… all with micro hands to confuse and bewilder the Kinect, but we did it anyway. Our only recourse was to implement a feature that forced the Kinect to track one hand at a time, and only after being waved at in a very particular fashion. After explaining this stipulation to every person who stopped by our booth over the course of the weekend, we decided never to use the Kinect for crowds again; lesson learned.
Delta Robots and DMX
Over the past year since that experience, we’ve tripled the size of the installation and brainstormed some better demo ideas. As of now, the robots are all individually addressable over an RS485 bus, and we use the DMX protocol over a CAT5 cable to send commands. If you aren’t familiar with it, DMX is used in show production to control stage lighting, for which there is a super neat and free application called QLC+ that lets you effectively orchestrate the motion and color of many individual light units; perfect for our cause.
Functionally, each of the 84 delta robots in the installation believes that it is a stage light (robots with identity issues). We mapped the X and Y axes of the end effector to the existing pan and tilt values, and the Z axis to the beam focus value. The RGB of the LED mounted in the end effector of each delta maps directly to the RGB value of the stage light.
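The channel mapping described above boils down to linear scaling into 0–255 DMX slots. Here's a rough sketch of what each robot's firmware might do with its six channels; the coordinate ranges, channel order, and six-channel footprint are our assumptions, not the installation's actual code:

```python
def delta_to_dmx(x, y, z, rgb, x_range=(-50.0, 50.0),
                 y_range=(-50.0, 50.0), z_range=(0.0, 100.0)):
    """Pack one delta robot's pose and color into six DMX slots (0-255).

    X -> pan, Y -> tilt, Z -> beam focus, then R, G, B, matching how a
    lighting desk sees a moving-head fixture. The coordinate ranges (in mm)
    are made up for illustration.
    """
    def scale(value, lo, hi):
        # Clamp-free linear map of [lo, hi] onto the 0-255 DMX slot range
        return int(255 * (value - lo) / (hi - lo))

    return [scale(x, *x_range), scale(y, *y_range), scale(z, *z_range), *rgb]

# Robot centered in X/Y, end effector raised to Z = 100 mm, glowing magenta:
print(delta_to_dmx(0, 0, 100, (255, 0, 255)))  # [127, 127, 255, 255, 0, 255]
```

Because the robot pretends to be a generic pan/tilt fixture, any off-the-shelf DMX controller, QLC+ included, can drive it without knowing it's a robot at all.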
By using the sliders in the QLC+ GUI, I could select groups of robots and create presets for position and color. This was great: someone like me, who doesn’t really write a lot of code, could whip up impressive choreography with little sweat. Additionally, the program comes with a nice visualizer where you can lay out virtual nodes and view your effects as you develop them.
This is the layout of our installation mapped in QLC+. The teal and purple sliders around each light represent pan and tilt (or in our case X and Y):
Lighting control was an interesting solution. Having autonomous robots this year changed how people responded to them, as they were less like an army you’d command and more of a hypnotic field of glowing grass.
[Mark] and I are considering picking up some flex sensors and maybe playing with the Leap or an EEG headset as a means to reintroduce the interactive aspect. Bottom line, I have this cool new toy that I can’t wait to play with over the summer!
Of course Maker Faire was loaded up with 3D printers, but we’re no longer in the era of a 3D printer in every single booth. Filament-based printers are passé, but that doesn’t mean there’s no new technology to demonstrate. This year, it was stereolithography and other resin-based printers. Here’s the roundup of each and every one displayed at the faire, and the reason it’s still not prime time for resin-based printers.
Of course the Formlabs Form 1+ was presented at the Bay Area Maker Faire. They were one of the first SLA printers on the market, and they’ve jumped through enough legal hoops to be able to call themselves the current kings of low-cost laser and resin printing. There were a few new companies and products at the Faire vying for the top spot, and this is where things get interesting.
The folks at Formlabs displayed the only functional print of all the resin-based 3D printing companies – a tiny, tiny Philco Predicta stuffed with an LCD displaying composite video. The display is covered by a 3D printed lens/window. That’s the closest you’re going to get to an optically clear 3D printed part at the Faire.
XYZPrinting is the company famous for the $500 printer that follows the Gillette model: sell the printer cheap, sell expensive replacement filament cartridges, and laugh all the way to the bank. Resetting the DRM on the XYZPrinting Da Vinci printer is easy, the proprietary host software can be done away with, and bricked devices can be recovered. Time for a new market, huh?
Enter the XYZPrinting Nobel, a resin printer that uses lasers to solidify parts 25 microns at a time. The build volume is 125x125x200mm (5x5x7.9″), with an X and Y resolution of 300 microns. Everything prints out just as you would expect. As far as laser resin printers go, it’s incredibly cheap: $1500. It does, however, use XYZware, the proprietary toolchain forced upon Da Vinci users, although the Nobel is a stand-alone printer that can pull a .STL file from a USB drive and turn it into an object without a computer. There was no mention of how – or if – this printer is locked down.
DWS Lab XFAB
You’ve seen the cheapest, now check out the most expensive. It’s the DWS Lab XFAB, an enormous and impressive machine that has incredible resolution, a huge build area, and when you take into account other resin printers, a price approaching insanity.
First, the price: $5000 officially, although I heard rumors of $6500 around the 3D printing tent. No, it’s not for sale yet – they’re still in beta testing. Compare that to the Formlabs Form 1+ at $3300, or the XYZPrinting Nobel at $1500, and you would expect this printer to be incredible. You would be right.
The minimum feature size of the XFAB is 80 microns, and it can slice layers down to 10 microns. Compare that to the 300 micron feature size of the Form 1+ and Nobel, and even on paper, you can tell they really have something here. Looking at the sample prints, they do. These are simply the highest resolution 3D printed objects I’ve ever seen. The quality of the prints compares to the finest resin cast objects, machined plastic, or any other manufacturing process. If you’re looking for a printer for very, very high quality work, this is what you need.
Also on display – but not in the 3D printing booth, for some reason – was the Sharebot Voyager. Unlike all the printers described above, this is a DLP printer; instead of lasers and galvos, the Voyager uses an off-the-shelf 3D DLP projector to harden layers of resin.
Strangely, the Sharebot Voyager was stuck in either the Atmel or the Arduino.cc (the [Massimo] one) booth. The printing area is a bit small – 56x96x100mm – but the resolution – on paper, mind you – goes beyond what the most expensive laser and galvo printers can manage: 50 microns in the X and Y axes, 20 to 100 microns in the Z. Compare that figure to the XFAB’s 80 micron minimum feature size, and you begin to see the genius of using a DLP projector.
The Sharebot Voyager is fully controllable over the web thanks to a 1.5GHz quad core, 1GB RAM computer that I believe is running 32 bit Windows. Yes, the spec sheet said OS: 32 bit Windows.
There were no sample prints, no price, and no expected release date. It is, for all intents and purposes, vaporware. I’ve seen it, I’ve taken pictures of it, but I’ve done that for a lot of products that never made it to market.
The Problem With Resin Printers
Taking a gander over all the resin-based 3D printers, you start to pick up on a few common themes. All the software is proprietary, and there is no open source solution for driving galvos and lasers, or for correctly displaying images on a DLP projector, to run a resin-based machine. Yes, you heard it here first: it’s the first time in history the open source hardware folks are ahead of the open source software folks. Honestly, an open source resin printer host is something that should have been done years ago.
This will change in just a few months. A scary, tattooed little bird told me there will soon be an open source solution to printing in resin by the Detroit Maker Faire. Then, finally, the deluge of resin.
A month before the Bay Area Maker Faire, there were ominous predictions the entire faire would be filled with BB-8 droids, the cute astromech ball bot we’ll be seeing more of when The Force Awakens this December. This prediction proved to be premature. There were plenty of R2 units droiding around the faire, but not a single BB-8. Perhaps at the NYC Maker Faire this September.
Regarding ball bots, we did have one friendly rolling companion at Maker Faire this year. It was a project by UC Davis students [Henjiu Kang], [Yi Lu], and [Yunan Song] that rolls around, seeking out whoever is wearing an infrared ankle strap. The team is calling it Project Naughty Ball, but we’re going to call it the first step towards a miniature BB-8 droid.
The design of the Naughty Ball is somewhat ingenious; it’s set up as a two-wheel balancing bot inside a clear plastic sphere. A ton of batteries work well enough as the ballast, stepper motors and machined plastic wheels balance and steer the ball bot, and the structure on the top hemisphere of the ball houses all the interesting electronics.
There is a BeagleBone Black with WiFi adapter, a few motor drivers, an IMU, and a very interesting 3D printed mount that spins the robot’s eyes – infrared cameras that spin around inside the ball and track whoever is wearing that IR transmitting ankle band.
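Once the cameras report a bearing to the ankle beacon, the chase behavior reduces to a simple differential-drive steering law for the internal two-wheel bot. This sketch is our own guess at what such a control loop could look like, not the students' code; the gains, limits, and percent-throttle units are all assumptions:

```python
def wheel_speeds(beacon_bearing_deg, base_speed=50, k_turn=1.0, turn_limit=30):
    """Steer the internal two-wheel bot toward the IR beacon.

    Speeds are in percent of full throttle. Positive bearing means the
    beacon is to the right, so the left wheel speeds up and the right
    wheel slows down. All constants here are illustrative.
    """
    # Proportional turn command, clamped so the bot never spins in place
    turn = max(-turn_limit, min(turn_limit, k_turn * beacon_bearing_deg))
    return base_speed + turn, base_speed - turn  # (left, right)

print(wheel_speeds(20))    # beacon 20 degrees right -> (70.0, 30.0)
print(wheel_speeds(0))     # dead ahead -> drive straight, (50, 50)
print(wheel_speeds(-90))   # far left -> turn clamped, (20.0, 80.0)
```

In the real robot this sits on top of the balance controller, which would fold the turn command into the stepper rates while keeping the internal pendulum upright.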
As far as robotics project go, you really can’t do better at Maker Faire than a semi-autonomous ball bot that follows its owner, and the amount of work these guys have put into this project sends it to the next level. You can check out a video description of their project below.
With VR headsets, the problem isn’t putting two displays in front of the user’s eyes. The problem is determining where the user is looking quickly and accurately. IMUs and image processing techniques can be used with varying degrees of success, but to do it right, it needs to be really fast and really cheap.
[Alan] and [Valve]’s ‘Lighthouse’ tracking unit does this by placing a dozen or so IR photodiodes on the headset itself. On the tracking base station, IR lasers scan in the X and Y axes. By scanning these IR lasers across the VR headset, the angle of the headset to the base station can be computed in just a few cycles of a microcontroller. For a bunch of one cent photodiodes, absolute angles and the orientation to a base station can be determined very easily, something that has some pretty incredible applications for everything from VR to robotics.
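The core trick is that a laser sweeping at a known rate turns a cheap time measurement into an angle. Here's a rough sketch of the math, assuming a 60 Hz sweep and a sync flash at the start of each revolution; Valve hadn't published the actual Lighthouse timing details, so treat these numbers as placeholders:

```python
SWEEP_HZ = 60.0   # assumed rotor speed: one full 360-degree sweep per 1/60 s

def sweep_angle_deg(seconds_after_sync):
    """Angle of the rotor when a photodiode saw the laser.

    The base station flashes an omnidirectional sync pulse at the start of
    each sweep, so the microcontroller only has to timestamp the laser hit
    relative to that pulse -- no image processing required.
    """
    return seconds_after_sync * SWEEP_HZ * 360.0

# A photodiode lights up 2.5 ms after the sync flash:
print(sweep_angle_deg(0.0025))   # 54.0 degrees into the sweep
```

Do this for the X sweep and the Y sweep on a dozen photodiodes whose positions on the headset are known, and you have enough angles to solve for the headset's full pose relative to the base station.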
Remember all of the position tracking hacks that came out as a result of the Nintendo Wii using IR beacons and a tracking camera? This seems like an evolutionary leap forward in the same realm, and we can’t wait to see people hacking on this tech!