We usually envision small wheeled robots when we think about swarm robotics, but these cooperative quadcopters make us think again. This is an extension of the same project that produced those impressive aerial acrobatics. It may not be as flashy, but watching groups of the four-rotored flyers grab onto and lift loads is quite impressive. There is also a shot of one dropping a 2×4 and immediately compensating for the loss of weight. We’re not certain, but it looks like team lifting doesn’t require the rig of 20 high-speed cameras that the acrobatics did. We’ve embedded the demonstration video after the break.
[youtube=http://www.youtube.com/watch?v=YBsJwapanWI]
[Thanks Balbor]
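If you’re wondering how one of these can shrug off suddenly losing its share of a 2×4, the short version is that the integral term in its altitude loop winds the hover thrust back down once the load is gone. Here’s a toy simulation of that effect; to be clear, this is just our back-of-the-napkin sketch with invented masses and gains, not the team’s actual controller:

```python
# Toy sketch, not the team's controller: a PID altitude loop showing how
# integral action re-trims thrust after a payload is suddenly released.
# All masses, gains, and timings are invented for illustration.

def simulate_drop(total_time=10.0, dt=0.01):
    g = 9.81
    craft_mass, payload_mass = 0.5, 0.3           # kg (hypothetical)
    kp, ki, kd = 8.0, 4.0, 5.0                    # PID gains (hypothetical)
    z, vz, integral = 0.0, 0.0, 0.0               # altitude state
    target = 1.0                                  # hold 1 m
    hover_trim = (craft_mass + payload_mass) * g  # thrust trimmed for loaded hover
    for step in range(int(total_time / dt)):
        t = step * dt
        mass = craft_mass + (payload_mass if t < 5.0 else 0.0)  # 2x4 let go at t = 5 s
        err = target - z
        integral += err * dt
        thrust = hover_trim + kp * err + ki * integral - kd * vz
        vz += (thrust / mass - g) * dt
        z += vz * dt
        if step % 100 == 0:
            print(f"t={t:4.1f}s  altitude={z:6.3f} m")

simulate_drop()
```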
Swamp robotics?? Should that be *swarm* robotics? lol
Well, that was honestly terrifying.
While an incredible feat of engineering, I can’t help but get an evil vibe off of their look and sound.
Manhacks anyone?
Posting alongside craig’s comment…
I usually think of airboats when I think of swamp robotics…
But it looks like they’re still using the cameras. Besides, I don’t think the robots have any sensors of their own aside from limited inertial data.
Edit: The gripper is what really amazes me though… I’d love to see more of how that works.
Sweet, now when they take over, they can carry away the younglings.
This is incredible; I’ve been waiting years to see someone actually pick up and transport something with an RC vehicle like this.
very cool, and while not a hack, worthy of hack a day.
@ charper- agreed, I want to see how the gripper works.
That’s high on the list of the most amazing yet still potentially useful things I’ve seen.
fyi: those red light sources are (I believe) the high-speed (mo-cap) cameras
Are the high-speed cameras that you mention the cameras visible at 0:22? They definitely still have a whole rig of them set up; it’s how the motion tracking is done. The cameras are made by a company named Vicon; they have an IR strobe that bounces of off reflective dots that are on the shaft heading to each rotor. The computer uses multiple cameras to figure out the position of the dot in space. You have to have a lot of cameras in order to have a large area that you can move through. We actually have 8 of those same cameras in the lab I’m working in right now that we use to measure the motion of cadaver limbs. We have machines that move around knees and shoulders to mimic human movement.
*off of
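If anyone wants a feel for how several cameras pin down one dot, here’s a toy least-squares intersection of the viewing rays. This is just an illustration with made-up camera positions, not Vicon’s actual algorithm:

```python
# Toy sketch (not Vicon's algorithm): locate a marker by finding the 3D point
# closest, in a least-squares sense, to the viewing rays from several cameras.
import numpy as np

def triangulate(cam_positions, ray_dirs):
    """Return the point minimizing the summed squared distance to all rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(cam_positions, ray_dirs):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Invented setup: three cameras all seeing a marker at (1, 2, 3).
marker = np.array([1.0, 2.0, 3.0])
cams = [np.array([0.0, 0.0, 0.0]),
        np.array([5.0, 0.0, 0.0]),
        np.array([0.0, 5.0, 1.0])]
rays = [marker - c for c in cams]         # perfect observations for the demo
print(triangulate(cams, rays))            # ~ [1. 2. 3.]
```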
@MS3FGX,
I had exactly the same unsettling feeling about it the moment I watched the video….
Not to say that it’s not ridiculously awesome.
Wondering about the total payload… ;)
A cluster of quadcopters for a grand total of 153 lbs would fit me and my camera.
Cool stuff! I look forward to seeing what they can do in a couple years! Along the lines of Charper’s comment, I would LOVE to see these individually as fully self-contained and ‘aware’ processors. If they could get a swarm of these (craig’s right!) to automatically recognize one another, link up, and process its own solution for lifting, that would be something truly awesome. I’m hoping that further down the road they are programmed so that units can come and go from the swarm (battery charging, etc.) and ‘fall in’ to the solution.
Awesome video. If these things get any more capable….. I think it is time to hide the cat.
Reminds me of “fly teamwork”, a trick I read about in high school but never was able to try. The idea was to capture and chill/freeze a number of houseflies.
Upon thawing out, they find themselves glued to a matchstick or something similar, and they must work together to navigate their new airship. :p
Honestly, the thing I find the most worrisome about this is the sound they make! Sounds like a swarm of hornets!
I can easily see people of the future cowering to that sound…
@Richard
Man, that sounded really creepy until I re-read it. Measuring the motion of cadaver limbs?
WONDER STRUCK =O
(yeah, that sounded like a World Cup game had broken out or something ;)
I don’t think it’d take more than a few more iterations of code to get fault tolerance. One bot drops the wood, another swoops in to grab it. I’d like to see acrobatics with multiple copters. These guys are doing such an amazing job.
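Something along these lines (purely hypothetical, not anything from the actual project) would cover the swoop-in case:

```python
# Hypothetical sketch of the fault-tolerance idea: if a copter reports losing
# its grip, a spare copter is assigned to the abandoned attachment point.

def reassign(assignments, spare_copters, attachment_points):
    """assignments: {copter: point, or None if its grip failed}."""
    covered = {p for p in assignments.values() if p is not None}
    for point in attachment_points:
        if point not in covered and spare_copters:
            spare = spare_copters.pop()
            assignments[spare] = point
            print(f"copter {spare} reassigned to point {point}")
    return assignments

# Copter B just dropped point 1; copter D is hovering as a spare.
print(reassign({"A": 0, "B": None, "C": 2}, ["D"], [0, 1, 2]))
```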
now they are just getting crazy with those things
@Charper Yeah, it’s pretty cool. And I get to singlehandedly figure out the control system and electronics. We have a machine that controls the vertical position of the hip and the X/Y position of the ankle, along with the force in 6 muscles and the three moments and three forces at the ankle.
We use a motion analysis rig similar to the one in the video to track the motion of a human walking and then make our machine move a cadaver’s leg the exact same way so that we can measure the internal forces.
The cameras should have sub-millimeter accuracy and I’ve tried a refresh rate of 1 kHz before. You can interface with the position data in real time and the software can tell you the rotation of different members of joints. It’s extremely powerful. If you guys are interested in how a lab like that would be set up I can take some pictures/screenshots.
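As a taste of what the real-time stream lets you do, here’s a toy example of turning three tracked markers into a joint angle. It’s not our actual code, and the marker coordinates are invented:

```python
# Illustration only: compute a joint's flexion angle from three tracked markers.
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two limb segments."""
    u = proximal - joint
    v = distal - joint
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

hip   = np.array([0.00, 0.00, 0.95])   # marker positions in metres (invented)
knee  = np.array([0.02, 0.01, 0.50])
ankle = np.array([0.10, 0.02, 0.08])
print(f"knee angle: {joint_angle(hip, knee, ankle):.1f} deg")
```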
it appears to me that the grippers are 4-point suction tubes, using a vacuum to “grip” the smooth pads.
I second the Manhack comment.
Definite flashbacks to Half Life 2.
It’s even got the sound going.
Where’s my damn crowbar?
So that’s what win sounds like.
You know the research will be used for bad things, but I just hope they’ll allow it to be used for one or two good things too.
@Richard: Pictures and an explanation would be great, sounds really interesting. How are the cameras interfaced? What is the main PC running (OS / programs), etc.?
@Richard
That sounds awesome. What’s the purpose? And how long do your “samples” last?
This system still uses a bunch of cameras to determine motion. See the bright red LEDs?
http://www.vicon.com/products/cameras.html
@Zerth The specimens last several dozen experiments; it really depends on the nature of the trial. Sometimes we put a lot of force on the knee, which makes it wear out faster.
As for the purpose, we have a computer simulation of the human knee that models all of the ligaments and muscles and will deform accurately. The model could be used for coming up with ways to treat knee damage like a torn ACL. The motion analysis rig is to figure out the position of a person’s knee while they walk around. We then make the cadaver knee move the same way that the living person’s did and measure all of the internal forces. If the forces we measure match those predicted by the simulation then the simulation is good.
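The comparison itself is nothing fancy, basically an RMS error between measured and predicted forces (the numbers below are made up, not real data):

```python
# Illustration with made-up numbers: how well do the simulation's predicted
# forces match the ones measured on the cadaver knee?
import math

measured  = [312.0, 455.0, 510.0, 470.0, 330.0]   # N, hypothetical trial
predicted = [305.0, 460.0, 498.0, 480.0, 340.0]   # N, hypothetical model output

rms = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured))
print(f"RMS force error: {rms:.1f} N")
```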
We also have a setup for figuring out the motion of the scapula, but that’s in the process of having the bugs worked out.
@Lars The cameras are connected to a dedicated computer provided by Vicon that works as a hub for the cameras and other sensors (EMG and force plates). They have a custom 8-pin circular connector, but I’m almost positive that they’re LAN. The PC is running XP and the program for motion analysis is Vicon Nexus. You can get the motion data in real time and interface with it in LabVIEW, which is probably how they control the quadcopters.
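And just to illustrate what using that data might look like for the quadcopters (pure speculation on my part, not Vicon’s API or their code): once you have the dot positions on two opposite rotor arms, the body position and yaw fall right out:

```python
# Speculative sketch, not a real API: recover a quadcopter's position and yaw
# from the tracked dots on two opposite rotor arms. Coordinates are invented.
import numpy as np

def pose_from_markers(front_marker, rear_marker):
    """Return (center position, yaw in degrees) of the quadcopter body."""
    center = (front_marker + rear_marker) / 2.0
    dx = front_marker[0] - rear_marker[0]
    dy = front_marker[1] - rear_marker[1]
    yaw = np.degrees(np.arctan2(dy, dx))
    return center, yaw

front = np.array([1.10, 2.05, 1.50])   # metres, invented
rear  = np.array([0.90, 1.95, 1.50])
center, yaw = pose_from_markers(front, rear)
print(center, f"yaw={yaw:.1f} deg")
```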
For pictures, I could post the links here or email them to you, whichever you would like. And I can go into more depth about the machine or the software or the electronics or the motion analysis if you would like.
@Richard: sounds very interesting! pictures would be great. i wonder how the “hub” computer handles all the data – do you know which hardware it has got? and are there special pci-ex cards with proprietary 8-pin plugs in there?
i can’t find any real sample images of the camera output. does it come in an uncompressed 8-bit greyscale stream? what container/codec is used for output/interfacing?
my email is “lars”, then place an “@” and then the domain “lx-m.de”
(yea i do fear spambots ;)
Excellent! Now I will have the means to get the remote on the table 20 ft away after I have sat down!
Very cool.