Autonomous Quadcopter Fits in the Palm of your Hand

[Horiken Engineering], a team of engineering students from the Department of Aerospace Engineering at the University of Tokyo, has developed an autonomous quadcopter that requires no external control, and it's tiny. Using two cameras and a sonar sensor, the quadcopter flies entirely on its own by processing the data from its on-board sensors. To handle the complex data processing fast enough to fly, it uses a Cortex-M4 MCU, a Spartan-6 FPGA, and 64MB of DDR SDRAM. It also has the usual quadcopter hardware, plus gyros, a 3D-printed frame, and a 3-axis compass. The following video demonstrates the quadcopter's tracking ability above a static image (a waypoint). The data you see in real time is only the flight log; the quadcopter receives no signal and can only transmit data.
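Horiken Engineering hasn't published their code, but the basic idea of closing a control loop around an on-board sensor is easy to sketch. Here's a minimal, purely illustrative altitude-hold loop: a PID controller driven by sonar altitude readings, run against a toy point-mass "plant" standing in for the real quadcopter dynamics (none of the gains or dynamics here are from the actual craft).

```python
class AltitudePID:
    """PID controller on sonar-measured altitude (illustrative gains only)."""

    def __init__(self, kp=2.0, ki=0.5, kd=1.0, dt=0.02):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        """Return a thrust command from the latest sonar reading."""
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def hold_altitude(target=1.0, steps=1500, dt=0.02):
    """Toy plant: acceleration = thrust - drag * velocity. Returns final altitude."""
    alt = vel = 0.0
    pid = AltitudePID(dt=dt)
    for _ in range(steps):
        thrust = pid.update(target, alt)  # the sonar reading stands in as `alt`
        vel += (thrust - 0.5 * vel) * dt  # simple drag, Euler integration
        alt += vel * dt
    return alt


print(f"settled altitude: {hold_altitude():.3f} m")
```

The real system obviously fuses far more than one sensor, but the structure (read sensor, compute error, command actuators, repeat at a fixed rate) is the core of any such loop.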

Is this the first step towards Amazon’s fleet of package delivering drones? It’s certainly going to be interesting when quadcopters are a common occurrence in public…

Comments

  1. Zee says:

Whoa, a monocular real-time camera that can do 3DOF or 6DOF tracking? On a microcontroller? What sorcery is this?

  2. Kerimil says:

So does it use something like SIFT features to detect its spatial orientation and distance? What if the surface below it isn't flat, or changes position and/or shape?

    • neon22 says:

The algo only works with a stationary scene below. SIFT and SURF are the kind of thing used for consistent feature detection. There is an open-source, patent-unencumbered feature detector inside the Hugin project, which is pretty neat all on its own…
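For anyone curious what "consistent feature detection" buys you: at its simplest, you find a distinctive patch in one frame and locate it again in the next. Here's a bare-bones sketch of that idea (not SIFT or SURF themselves, just sum-of-squared-differences template matching on a synthetic two-frame example):

```python
import random


def ssd(patch, image, x, y):
    """Sum of squared differences between patch and the image window at (x, y)."""
    return sum(
        (patch[j][i] - image[y + j][x + i]) ** 2
        for j in range(len(patch))
        for i in range(len(patch[0]))
    )


def track_patch(patch, image):
    """Exhaustive search for the window that best matches the patch."""
    ph, pw = len(patch), len(patch[0])
    h, w = len(image), len(image[0])
    return min(
        ((x, y) for y in range(h - ph + 1) for x in range(w - pw + 1)),
        key=lambda p: ssd(patch, image, p[0], p[1]),
    )


random.seed(0)

# Frame 1: a bright 3x3 blob at (2, 2) on a dark, noisy background.
frame1 = [[random.randint(0, 30) for _ in range(12)] for _ in range(12)]
for j in range(3):
    for i in range(3):
        frame1[2 + j][2 + i] = 255
patch = [row[2:5] for row in frame1[2:5]]

# Frame 2: the camera drifted, so the blob now sits at (6, 5).
frame2 = [[random.randint(0, 30) for _ in range(12)] for _ in range(12)]
for j in range(3):
    for i in range(3):
        frame2[5 + j][6 + i] = 255

print(track_patch(patch, frame2))  # → (6, 5)
```

Real detectors like SIFT add scale and rotation invariance and compact descriptors so the matching survives much more than a simple translation, but the frame-to-frame correspondence idea is the same.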

  3. uri says:

    Electronic skeet . . .

  4. scorinth says:

    As an ME student trying to get into autonomous vehicles, I’m *dying* to see their algorithms.

  5. jioga says:

One buckshot round and its fancy control systems won't be of much use.

  6. LeonardMcCoy says:

Quoting Dr. Leonard "Bones" McCoy: "Fantastic machine! No OFF switch!"

  7. Rich Grise says:

    “Autonomous?” How does it decide what it wants to do?

  8. rasz says:

If anyone wants to have a go:

    http://www.robots.ox.ac.uk/~gk/PTAM/

Looks like the guys at Horiken Engineering managed to squeeze PTAM onto an FPGA. Very impressive. I'd love to be able to buy their FPGA alone, or see the code (sadly, the Japanese aren't exactly known for open-sourcing).

  9. neon22 says:

It's able to see how it's moving relative to the B&W pattern. If it loses its way as it moves (after the buffer fills up with many more samples), then it will be globally lost but still able to orient itself locally, as long as there is contrast in the visual pattern it can see below.

If you're seriously interested in this, here's a Python-based course (incl. code) and video set. It's the full SLAM with bundle adjustments etc. and pretty much state of the art. Implement it in your embedded CPU of choice. http://www.youtube.com/playlist?list=PLpUPoM7Rgzi_7YWn14Va2FODh7LzADBSm
Thanks, Claus Brenner.

I'd suggest the polyhelo or OpenPilot Revolution platforms, both of which use Cortex-M4 ARM CPUs with an FPU. They have many DOF sensors on them.

When MicroPython is up and running (Feb maybe?) you might be able to use an STM32F429 (25 USD from STM) and code it all in Python directly. (FYI, MicroPython can compile down to native machine code for speed.)
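The "globally lost but locally oriented" point above is easy to demonstrate numerically: if each frame-to-frame motion estimate carries a tiny error, every individual step is accurate, yet the integrated position drifts without bound. A toy 1D dead-reckoning sketch (plain Python, not from the linked course):

```python
import random

random.seed(42)

true_pos = 0.0           # ground-truth position
est_pos = 0.0            # dead-reckoned estimate
worst_step_error = 0.0   # largest single-frame estimation error seen

for _ in range(1000):
    true_step = 0.01                                # actual motion this frame
    measured = true_step + random.gauss(0, 0.001)   # noisy relative estimate
    worst_step_error = max(worst_step_error, abs(measured - true_step))
    true_pos += true_step
    est_pos += measured   # integrating relative estimates = dead reckoning

drift = abs(est_pos - true_pos)
print(f"worst per-step error: {worst_step_error:.4f}")
print(f"accumulated global drift after 1000 steps: {drift:.4f}")
```

The accumulated drift typically ends up much larger than any single step's error, which is exactly why full SLAM adds loop closure and bundle adjustment to pin the trajectory back to globally consistent landmarks.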

  10. M8R says:

Behold! The fleet of terminator drones is being released by SKYNET!! To destroy all humans! Everywhere on this planet!
