3D Drone Video

If you enjoy flying quadcopters, it’s a good bet that you have a drone with a camera. It used to be enough to record a video for later viewing, but these days you really want to see a live stream. The really cool setups have goggles so you can feel like you are actually in the cockpit. [Andi2345] decided to go one step further and build a drone that streams 3D video. You can see a video of the system below.

Outdoors, there’s probably not a lot of advantage to having a 3D view, but it ought to be great for a small indoor drone. The problem, of course, is that a small drone doesn’t have much capacity for two cameras. The final product uses two cameras kept in sync with a sync separator IC and a microcontroller, while an analog switch intersperses the frames.
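The interleaving idea is simple enough to sketch in a few lines. This is only a simulation of the switching logic, not [Andi2345]’s firmware: we assume the microcontroller counts vertical sync pulses from the sync separator and flips the analog switch on each one, so the combined stream alternates frames from the two cameras (each camera therefore contributes only every other frame).

```python
def interleave(cam_a_frames, cam_b_frames):
    """Simulate the analog switch: even vsync counts pass camera A,
    odd counts pass camera B. Each camera ends up at half frame rate."""
    out = []
    for i, (a, b) in enumerate(zip(cam_a_frames, cam_b_frames)):
        out.append(a if i % 2 == 0 else b)
    return out

stream = interleave(["A0", "A1", "A2", "A3"], ["B0", "B1", "B2", "B3"])
print(stream)  # ['A0', 'B1', 'A2', 'B3']
```

Note that this only works because the two cameras are genlocked; if they ran on independent clocks, the switch would cut mid-frame and the picture would tear.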

On the viewing side, a USB frame grabber and a Raspberry Pi split the images again. At first, the system used an LCD screen married with a Google Cardboard-style goggle, but eventually, this became a custom Android application.
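The receiving side just has to undo the interleave. A minimal sketch of that de-multiplexing step (again a simulation, not the actual Pi code): frames arriving from the USB grabber alternate left/right, so splitting on even and odd indices recovers the two eye streams.

```python
def deinterleave(frames):
    """Split an interleaved capture back into left and right eye streams:
    even-indexed frames go left, odd-indexed frames go right."""
    left = frames[0::2]
    right = frames[1::2]
    return left, right

left, right = deinterleave(["L0", "R0", "L1", "R1"])
print(left, right)  # ['L0', 'L1'] ['R0', 'R1']
```

The catch in practice is phase: the grabber doesn’t know which frame is "left," so the viewer needs a way to swap eyes, which is exactly the mirror adjustment mentioned in the comments below.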

The two cameras work best if you can hack them to use the same clock. If you don’t, then one image may appear to roll. This might be less disturbing if you rotate the cameras so that the roll goes right to left instead of up and down. The system introduces about 100 ms of delay which isn’t perfect but is workable, apparently.
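To put that 100 ms in perspective, a quick back-of-the-envelope calculation (assuming NTSC-rate analog video at roughly 29.97 fps) shows it amounts to about three frames of lag:

```python
# How many frames behind is a 100 ms glass-to-glass delay at NTSC rates?
fps = 29.97                       # NTSC frame rate assumption
frame_time_ms = 1000 / fps        # duration of one frame
delay_ms = 100                    # reported system delay
frames_behind = delay_ms / frame_time_ms

print(round(frame_time_ms, 1))    # → 33.4  (ms per frame)
print(round(frames_behind, 1))    # → 3.0   (frames of lag)
```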

Small drones are easy to find, or you can build your own. We’ve seen cameras ganged for 3D before, but never at this small of a scale.

10 thoughts on “3D Drone Video”

  1. Hmm. True, depth perception from two closely-spaced cameras mounted on a single quadcopter might be lousy at any significant altitude…

    But, heck, with the formation-flight capability already demonstrated in things like Intel’s “Shooting Star” drone swarms, it would be simple to put cameras on two separate craft and fly them in formation at any given separation to get a very nice stereo baseline. Frame sync could come from GPS timing or the ground-based pilot if you’re picky, but it probably isn’t even necessary.

    ObXKCD: https://xkcd.com/941/

    1. You don’t really need depth information for things that are very far away – it’s not like you’re going to crash into them. Once something gets close enough to potentially hit you, it’ll also be close enough for stereo.

      1. The primary problem with “drones” is that they’re good only for inspection from a distance. This will add close-up capability and make things like searching a house or disaster site a real possibility plus allow for automation to take over (and improve) the inspection in much more confined areas than were possible before.

  2. This might be really useful for freestyle FPV. When I flew my quad, I had trouble judging my position in the surrounding space because of the fisheye distortion. Hope this does not have >50 ms lag. On mobile it has around 200 ms, which is too much; then it would be better suited to assisted flying rather than acro.

  3. pretty much off-topic, but does anyone know where we can find the “open source” North Star by Leap Motion?
    They made the two blog posts on April 9th and announced the sources to follow within the next week, but this never happened? Seems like it was just advertisement for the Leap Motion itself…

  4. I have often considered just putting one of the clip on phone 3D adapters (some mirrors that split the image) over the camera and just using a normal cheap “put your phone in” VR headset over the screen to view it. Sadly the amount of picture lost to those cheap adapters is more than I could put up with so I really like the dual camera sync idea :)

  5. Nice, but no more.
    The problem is and always will be lag.
    As soon as you introduce any digital components, you almost certainly end up with lag, and anything above 200µs is not acceptable for FPV flying… some even say this is too much.

    It has been tried many times by hobbyists and even by enterprises, but so far with not much success. Ever wondered why FPV gear is still analog in 2018? Lag is why.

    1. You mean milliseconds, not microseconds. Analog NTSC FPV doesn’t get anywhere near as low as 200 microseconds. You CAN get digital latency just as low as the typical analog FPV latency, but it usually means a very expensive custom solution. But it is available for FPV. Luckily, this is exactly the same problem as VR (especially considering VR headsets often include front-facing external cameras nowadays), so I suspect that long-term it will become cheaper to get wireless digital video latency down to an acceptable range. And VR is higher framerate (90 fps is standard, 120fps is available) than FPV (nowadays 25 to 30 fps actual transmission speed depending on PAL or NTSC), so fundamentally it can get even lower latency than analog FPV.

      It just means you can’t rely on the typical easy hacker toolset of a Raspberry Pi or Android phones and such. You need digital components and a digital pipeline actually made for low latency, or you’ll get all the high latency you just mentioned. And with care, a digital FPV solution could blow analog out of the water as far as latency goes.

    1. No, they are mirrored. You can adjust this in the Android app. The latency is about 100 ms, so it’s flyable, but not for racing drones. Have a look at Instructables or YouTube for further details.
