Haddington Dynamics, the company behind the Dexter robot arm that won the 2018 Hackaday Prize, has opened its first micro-factory to build robot arms for Australia and Southeast Asia.
You may remember that the combination of Dexter’s makeup and capabilities is what let it stand out among robotics projects. The fully-articulated robot arm can be motion trained; it records how you move the arm and can play the motion back with high precision rather than needing to be taught with code. That high precision is thanks to a clever encoder design that leverages the power of FPGAs to amplify the granularity of its optical encoders. And it embraces advanced manufacturing, combining 3D-printed and glued-up parts with mass-produced gears, belts, bearings, and motors.
It’s a versatile robot arm, for a fraction of the cost of what came before it, with immense potential for customization. And did I mention that it’s open source?
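That record-and-playback workflow is simple enough to sketch in a few lines. To be clear, this is only a toy illustration of the idea, not Haddington Dynamics’ actual software; the callback names, sampling rate, and five-angle tuple are invented for the example:

```python
import time

def record(read_joint_angles, duration_s=5.0, rate_hz=100):
    """Sample joint angles while a person moves the arm by hand."""
    samples = []
    period = 1.0 / rate_hz
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        samples.append(read_joint_angles())  # e.g. a tuple of 5 joint angles
        time.sleep(period)
    return samples

def play_back(samples, write_joint_angles, rate_hz=100):
    """Replay the recorded trajectory at the rate it was captured."""
    period = 1.0 / rate_hz
    for angles in samples:
        write_joint_angles(angles)
        time.sleep(period)
```

In the real arm the samples come from the high-resolution optical encoders, which is what makes the playback so precise.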
What is a Micro-Factory?
Simply put, a micro-factory is a recreation of the tools and skills the Haddington Dynamics team has at its current headquarters in Las Vegas. With four production stations in the office, each capable of building two robots at a time, the team can build 30 robots a month.
The micro-factory model licenses this technology. Those who will staff the new locations come to headquarters for training in the process of building a robot arm. They take this knowledge back and use the same tools and materials to begin producing robots at their own location.
The first micro-factory-produced robot arm was completed on February 24, 2020 in Toowoomba, Australia, about 120 kilometers (75 miles) inland from Brisbane. There are currently two people working at the new DCISIV Technologies location, and their near-term goal is to reach a capacity of 32 robots per month with a workforce of four people.
Why Not Traditional Manufacturing?
So why not just set up shop in a traditional factory and start banging out robot arms as fast as you can stack them on pallets?
First off, total sales volume isn’t quite there yet. But with about 300 robots now in the wild across over 20 different countries, it’s not hard to see that they do need to step up production capacity. The immediate factors driving their assembly methods are the complexity of the parts and their desire to customize and improve the design without the headache of factory retooling.
The current design includes parts that would be difficult or impossible to injection mold, as well as others made of materials like carbon fiber. So they 3D print many of their parts, in materials ranging from PETG and nylon, to carbon-fiber-reinforced nylon (called Onyx filament), to continuous carbon fiber filament. For multi-material parts, molds are used to align everything during the glue-up process. The molds and the 3D printers themselves are all part of the equipment specified for the micro-factory model. You can check out the intense manufacturing process in the assembly image gallery or in this eleven-part video series.
There are some very interesting things on the horizon. At launch, each of these micro-factories is set up to build robot arms. But there’s no reason they couldn’t be used to make something else. Haddington Dynamics wants to see robots building robots. This first iteration is 3D printers, which we suppose are a type of robot, building robot arms. The next iteration could be robot arms doing the building.
It’s also worth noting that this model untangles complex supply chains. The majority of parts are manufactured on-site. The remainder are common goods like threaded rod and fasteners. Even the bearings come from common mass-produced items like motorcycle hub bearings and inline-skate wheel bearings. If there were ever a shortage of these components, it’s entirely possible to quickly redesign the manufactured parts to suit a replacement.
What Does Dexter Have in Its Future?
As mentioned before, one of the key features of Dexter is how easy it is to train the robot arm. The video below shows two arms being trained to pull a pint of beer from a tap. One uses a special end effector to pick up the cup while the other operates the tap.
It’s a great way to show off what the robot can do, but real-world applications are not far off from this type of skill set. The team tells me they’re working on a routine that uses Dexter for gong meditation. For those who are not able to connect in person with a human instructor, the robot arm can precisely record and play back the motions used.
They’re also working with NASA to establish a commercial drone certification protocol. Delivery companies are champing at the bit to get automated drone delivery in place, with the FAA trying to keep up in establishing safety regulations. Large drones need to have their airframes inspected every 100 hours of flight, and any drone that flies beyond line of sight — the purpose of a delivery drone — needs to have the integrity of its electronics recertified for every flight. This is labor intensive for what is meant to be an autonomous delivery system, and Dexter is being tasked with automating the drone inspection process for both airframe and electronics through NASA’s Fit2Fly program (PDF).
Dexter is being used at a university for stem cell research, and robotics students at Duke University have recently done the work to use Dexter in conjunction with the Robot Operating System. Two of the robots are headed to Lawrence Livermore National Laboratory, although the team is not sure what their role there will be. And back on the manufacturing automation front, Haddington Dynamics is working on a protocol to use the robot arms to build cable harnesses, a dexterous job that is often done by humans; when it is automated, the machines are prohibitively expensive.
The future is automation, and Dexter makes that future look like a pretty good tomorrow.
32 thoughts on “Dexter Robot Arm Embraces New Manufacturing With First Micro-Factory”
Thanks for the post, keen to see comments on augmentations and variations, whether practical or otherwise :-)
It’s another Perth guy! Apparently there is a local mob that are reselling these, which is pretty cool.
They had a webinar with Markforged, the printer manufacturer, that was quite interesting.
Ah ha, are you the RF guy from WAIT (now Curtin) or UWA back in fox-hunting days, rotating multi-element Yagis on car roofs? ;-)
Hmm, on reflection that was maybe Adrian Van Den a Voort…
Hey, AOG starts tomorrow and entrance is free, though too hot for me, so going Thursday when it’s cooler. Browsing your project links, might borrow your website template, ’tis pretty good; my own site is well overdue for an update. Cheers
“The future is automation, and Dexter makes that future look like a pretty good tomorrow.”
Tagline that should go with a movie starring Arnold Schwarzenegger.
“Come with me if you want to automate”
Right, hasta la vista your re-pet to mars.
It would need machine guns and grenade launchers for when all inevitably goes wrong and the machines need fixing.
A nice but unfortunately very expensive robot.
and the latest app is a gong meditation machine. They need a marketing guru outside academia. It looks great, but touting that is a turn off.
“The fully-articulated robot arm can be motion trained; it records how you move the arm and can play back with high precision rather than needing to be taught with code.”
This is the key to a lot of the market – the “Cobots” (collaborative robots) that are being implemented in various corners of manufacturing are being used *because* you (mostly) don’t need a field engineer on station to make it all go. Move the arm, tap the tablet, and you’re done.
I would sincerely hope that this article’s platform succeeds, and opens others’ eyes to the possibility of add-ons. In my experience Universal Robots have done very well with plug-and-play vision systems from third parties. Here’s the general idea (I have no personal involvement with any of these, but I see them in use).
‘Trained’ motion is a relatively standard method of industrial automation. It has the very obvious limitation that unless everything is in exactly the correct pose as the motion expects, things go very wrong very fast (i.e. “that robot just shoved a chassis right through another chassis” bad).
The key for collaborative robots is that they do not just perform a fixed motion sequence: they can be bumped and recover (and bumping them disrupts the robot but does not injure the human, something industrial robots are terrifyingly bad at), they can actively avoid people in their reach zone, and can tolerate objects being in the wrong position or pose without retraining.
That video does not show a complete cycle from the robot picking the cup up to the robot delivering the full cup. There are fades and cuts all over the place. This leads me to the logical conclusion that it didn’t work and they had to fix it in video editing.
“Any sufficiently advanced technology is indistinguishable from a rigged demo.”
James Klass, or maybe even Einstein, Abraham Lincoln or Aristotle, got it off teh interwebs after all.
Here is the B-roll for the video:
Here are a couple of other uncut demo videos:
One from the same bar a few nights later when we actually had the robot bartend:
And one of the robot pouring soup:
Having worked pretty extensively with 6-axis arms for my day job, the cobot style of moving the arm by hand (not keypad) and recording a point never works as well as you think it does. Something about having to use the motors, brakes, and force sensors to physically position the arm is jerky at best. Almost anyone, I think, within a week will be more accustomed to using the keypad to move the arm. The reference frames are very good (X, Y, Z, User, Tool, etc.) and offer much more accuracy than hand moving. The simulators are equally good now, so you can build the program offline and then only have to tweak it once you get it running in the manufacturing space.
That’s true when the arm moves by sensing external forces via sensors between the joint and the load, because that method introduces error between the actual position during recording and the recorded position. Dexter doesn’t do that. It senses force behind the joint, and position (not force) at the joint via the encoder, so there is zero error between the recorded and actual positions. Playback is dead nuts exact and smooth, accurately capturing the dexterity of human motion. That is, in fact, where the name “Dexter” came from.
Please take the time to read up about the encoders on the Dexter Wiki:
Super rad encoding solution. Curious as to how the variations in the slots let you do absolute encoding? I assume the 3D printing artefacts are used as some sort of fingerprint? How well does it work?
When the robot is assembled, we do a calibration where each joint is run through its full range of motion, using momentum to smooth that out and provide a near perfect “scan” of the output from the encoder. We record that output at ultra high speed via the FPGA and RAM. This becomes the standard against which we then measure the output and convert it back into position. So any variations in the slots are “baked in” to the system and get baked right back out during use.
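That baked-in-fingerprint idea can be sketched as: build the reference scan once, then recover an absolute position by matching a short window of recent readings against it. The real system does this in an FPGA at very high speed; the function names and the brute-force search below are purely illustrative:

```python
def calibrate(scan):
    """Store the full-range encoder scan captured during a smooth,
    momentum-assisted sweep of the joint.  Slot-width variations from
    3D printing are 'baked in' as a position fingerprint."""
    return list(scan)

def locate(reference, window):
    """Find where a short run of recent encoder readings best matches
    the calibration scan (minimum sum of squared differences); the
    matching index is the absolute position along the sweep."""
    best_i, best_err = 0, float("inf")
    n = len(window)
    for i in range(len(reference) - n + 1):
        err = sum((reference[i + j] - window[j]) ** 2 for j in range(n))
        if err < best_err:
            best_i, best_err = i, err
    return best_i
```

Because the slot variations are unique along the sweep, a short window is usually enough to pin down where the joint is without a homing move.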
It’s hard to describe how well it works, but we get 20 micron or better precision at the end of five axes. You work with robots, you know how good that is.
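To get a feel for what 20 micron at the end of the arm implies, here is some back-of-the-envelope arithmetic. The 0.5 m reach is a figure picked purely for illustration, not a Dexter spec:

```python
import math

reach_m = 0.5          # assumed arm reach (illustrative, not a Dexter spec)
tip_error_m = 20e-6    # the 20 micron precision quoted above

# Small-angle approximation: tip error ~= angle error * reach
angle_rad = tip_error_m / reach_m

# Encoder steps needed per revolution to resolve that angle
counts_per_rev = (2 * math.pi) / angle_rad

print(angle_rad)              # 4e-05 rad
print(round(counts_per_rev))  # ~157,080 distinguishable positions per turn
```

That is far beyond the raw slot count of a printed optical disk, which is why the FPGA-amplified encoder resolution matters so much.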
Sounds like something that could be useful in an FDM 3D printer.
I do know how good that is. It’s pretty damn good. Especially out at the end effector. Given the cycle speed of the encoder/FPGA combo, are you able to sense texture? Something like this would be super handy in the meat processing space.
I can see these arms being controlled through the internet by a person living in a cheap-labor country, such as China or India.
A kind of……….. Tele-sourcing®.
Working on that. Easier said than done as it turns out. We could use some help.
Have a job that can’t be done remotely, and so must expose yourself or your coworkers to viruses? Here is a solution in the not-too-distant future. Also, the ability to reach out and use expensive machines at makerspaces in other locations, then package and ship the manufactured part to yourself. Or watch an event from a camera angle you control remotely. Shake hands with a friend, touch a lover, care for a family member from far away… Fix a broken machine for a customer, provide remote support, pop into a vending machine and wave at the customers from the other side of the world… many uses.
Not to mention the obvious obvious killer app… send them to people you don’t like so you can slap them over the internet.
Afraid I had to send mine back. Something about being worn out.
Too serious and industrial, not hobby-like.
Back to hobby: it could be used to train Wing Chun… two hands, but it would need feet. It would be the fifth level, combining sticky hands and the level-4 wooden dummy.
Better show me it carving statues in wood.
The movement reminds me of Baxter. It can be jerky, which I assume is due to recording exact movements. I wonder if they will add smoothing to the movements in the future.
‘This is the key to a lot of the market – the “Cobots” (collaborative robots) that are being implemented in various corners of manufacturing are being used *because* you (mostly) don’t need a field engineer on station to make it all go. Move the arm, tap the tablet, and you’re done.’ cit.
You have not needed a ‘field engineer’ to program an industrial robot for, well, the last 30 years. Programming some movement is easy; it always was. Now it’s just more colors on the screen ^^.
But the robot is not a standalone machine. You need to integrate it into a bigger machine (integration / communication). And you mostly have other applications than serving beverages. And this is when you need an expert.
Btw., the machine in the video does not hold its TCP. Either the coordinate system is not (or is wrongly) applied (-> see: integration), or he has chosen the wrong tool, or the tool is not correctly measured, or, or, or. Moving the robot around, by hand or by tablet, is the easy part. For the other stuff, you NEED some (or more) knowledge.
This was a reply to [Thinkerer], but alas, the reply works just fine…