See The Science Behind VR Display Design, And What Makes A Problem Important

VR headsets are more and more common, but they aren’t perfect devices. That meant [Douglas Lanman] had a choice of problems to address when he joined Facebook Reality Labs several years ago. Right from the start, he perceived an issue no one seemed to be working on: the fact that the closer an object in VR is to one’s face, the less “real” it seems. There are several reasons for this, but the general way it presents is that the closer a virtual object is to the viewer, the more blurred and out of focus it appears to be. [Douglas] talks all about it and related issues in a great presentation from earlier this year (YouTube video) at the Electronic Imaging Symposium that sums up the state of the art for VR display technology while giving a peek at the kind of hard scientific work that goes into identifying and solving new problems.

Early varifocal prototype

[Douglas] chose to address these seemingly-minor aspects of how the human eye and brain perceive objects and infer depth for two reasons: no good solutions existed, and the cues involved play a large role in close-range VR interactions. Things within touching or throwing distance are a sweet spot for interactive VR content, and the state of the art wasn’t really delivering what human eyes and brains expect to see. This led to years of work designing and testing varifocal and multi-focal displays which, among other things, were capable of presenting images in a variety of realistic focal planes instead of a single flat one. Not only that, but since the human eye expects things that are not in the correct focal plane to appear blurred (which is itself a depth cue), simulating that blur accurately was part of the work, too.

The entire talk is packed full of interesting details and prototypes. If you have any interest in VR imaging and headset design and have a spare hour, watch it in the video embedded below.


Arduino Provides Hands-Free Focus For Digital Inspection Scope

With surface-mount technology pushing the size of components ever smaller, even the most eagle-eyed among us needs some kind of optical assistance to do PCB work. Lots of microscopes have digital cameras too, which can be a big help – unless the camera fights you.

Faced with a camera whose idea of what to focus on didn’t quite coincide with his, [Scott M. Baker] took matters into his own hands – foot, actually – by replacing mouse inputs to the camera with an outboard controller. His particular camera’s autofocus can be turned off, but only via mouse clicks on the camera’s GUI. That’s disruptive while soldering, so [Scott] used an Arduino Pro Micro and a small keypad to mimic the mouse movements needed to control the camera.

At the press of a key, the Arduino forces the mouse cursor up to the top left corner of the screen, pulls down the camera menu, and steps down the proper distance to toggle autofocus. The controller can also run the manual focus in and out or take a screenshot. There’s even a footswitch that forces the camera to refocus if the field of view changes. It looks really handy, and as usual [Scott] provides a great walkthrough in the video below.
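We haven’t seen [Scott]’s firmware, but the trick of an ATmega32U4 board posing as a USB mouse is easy to sketch with the stock Arduino Mouse library. In this minimal, hypothetical example the button pin, the number of menu rows, and the pixel height of each row are assumptions, not values from his build:

```cpp
// Hypothetical sketch of the idea: a Pro Micro (ATmega32U4) enumerates as a
// USB mouse and blindly navigates the camera GUI. Pin numbers, menu geometry,
// and step counts are assumptions that would need tuning to the real GUI.
#include <Mouse.h>

const int AF_TOGGLE_PIN = 2;   // pushbutton (or footswitch) to ground
const int MENU_ROWS     = 6;   // assumed: rows between menu top and "autofocus"
const int ROW_HEIGHT_PX = 40;  // assumed pixel height of one menu row

void setup() {
  pinMode(AF_TOGGLE_PIN, INPUT_PULLUP);
  Mouse.begin();
}

// Mouse.move() is relative, so slam the cursor up and left far enough that
// it pins against the top-left corner no matter where it started.
void parkCursorTopLeft() {
  for (int i = 0; i < 50; i++) {
    Mouse.move(-120, -120, 0);
  }
}

void toggleAutofocus() {
  parkCursorTopLeft();
  Mouse.click(MOUSE_LEFT);            // open the camera application's menu
  delay(200);
  for (int i = 0; i < MENU_ROWS; i++) {
    Mouse.move(0, ROW_HEIGHT_PX, 0);  // walk down to the autofocus entry
    delay(50);
  }
  Mouse.click(MOUSE_LEFT);            // toggle autofocus
}

void loop() {
  if (digitalRead(AF_TOGGLE_PIN) == LOW) {
    toggleAutofocus();
    delay(500);                       // crude debounce
  }
}
```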

Like it or not, if shrinking technology doesn’t force you into the microscope market, entropy will. If you’re looking for a buyer’s guide to microscopes, you could do worse than [Shahriar]’s roundup of digital USB scopes. Or perhaps you’d prefer to dumpster dive for yours.


Kinect And Raspberry Pi Add Focus Pulling To DSLR

Prosumer DSLRs have been a boon to the democratization of digital media. Gear that once commanded professional prices is now available to those on more modest budgets. Not only has this unleashed a torrent of online content, it has also started a wave of camera hacks and accessories, like this automatic focus puller based on a Kinect and a Raspberry Pi.

For [Tom Piessens], the Canon EOS 5D has been a solid platform but suffers from a problem. The narrow depth of field possible with DSLRs makes it difficult to maintain focus on subjects that are moving relative to the camera, making follow-focus scenes like this classic hard to reproduce. Aiming for a better system than the stock autofocus, [Tom] grafted a Kinect sensor and a stepper motor actuator to a Raspberry Pi, and used the Kinect’s depth map to drive the focus ring. Parts are laser-cut, including a nice enclosure for the Pi and display that makes the whole thing reasonably portable. The video below shows the focus remaining locked on a selected region of interest. It seems like movement along only one axis is allowed; we’d love to see this system expanded to follow a designated object no matter where it moves in the frame.
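We haven’t dug into [Tom]’s code, but the control idea is straightforward: pull a robust depth reading for a region of interest out of each Kinect frame, then turn changes in that distance into focus-ring steps. Here is a rough C++ sketch of that idea under stated assumptions; the 640×480 millimetre depth frame, the dead-band, and the step scaling are placeholders, not values from the actual build.

```cpp
// Hypothetical sketch, not [Tom]'s code: given a Kinect depth frame as a
// 640x480 array of millimetre values, take the median depth of a small
// region of interest and convert the change in distance into stepper moves
// for the lens's focus ring. All constants are assumptions.
#include <vector>
#include <algorithm>
#include <cstdint>
#include <cstdlib>

const int W = 640, H = 480;

// Median depth (mm) of a square ROI centred on the subject; the median
// shrugs off dropouts (zero readings) and stray background pixels.
uint16_t roiDepth(const std::vector<uint16_t>& frame, int cx, int cy, int half = 20) {
    std::vector<uint16_t> samples;
    for (int y = cy - half; y <= cy + half; ++y)
        for (int x = cx - half; x <= cx + half; ++x) {
            uint16_t d = frame[y * W + x];
            if (d > 0) samples.push_back(d);        // 0 = no valid reading
        }
    if (samples.empty()) return 0;
    std::nth_element(samples.begin(), samples.begin() + samples.size() / 2, samples.end());
    return samples[samples.size() / 2];
}

// Convert a change in subject distance into focus-ring steps. One step per
// 4 mm and a 20 mm dead-band are made-up numbers that would need calibrating
// against the real lens and gearing.
int focusCorrection(uint16_t depthMm, uint16_t lastDepthMm) {
    int delta = static_cast<int>(depthMm) - static_cast<int>(lastDepthMm);
    if (std::abs(delta) < 20) return 0;             // ignore sensor jitter
    return delta / 4;                               // steps to send to the motor
}
```

In practice something like this would sit in a loop on the Pi, fed by the Kinect driver, with the ROI set to whatever region of interest has been selected on the display.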

If you’re in need of a follow-focus rig but don’t have a geared lens, check out these 3D-printed lens gears. They’d be a great complement to this backwoods focus-puller.


3D Printed Lens Gears For Pro-grade Focus Pulling

Key Grip, Gaffer, Best Boy – any of us who’ve sat through every last minute of a Marvel movie to get to the post-credits scene – mmm, shawarma! – have seen the obscure titles of folks involved in movie making. But “Focus Puller”? How hard can it be to focus a camera?

Turns out there’s a lot to the job, and in many cases it makes sense to mechanize the task. Pro cinema cameras have geared lens rings for just that reason, and now your DSLR lens can have them too with customized, 3D-printed follow-focus gears.

Unwilling to permanently modify his DSLR camera lens and dissatisfied with after-market lens gearing solutions, [Jaymis Loveday] learned enough OpenSCAD to generate gears from 50mm to 100mm in diameter in 0.5mm increments for a snug friction fit. Teamed up with commercially available focus pulling equipment, these lens gears should really help [Jaymis] get professional results from consumer lenses.

Unfortunately, [Jaymis] doesn’t include any video of the gears in action, but the demo footage shown below presumably has some shots that were enabled by his custom gears. And even if it doesn’t, there are some really cool shots in it worth watching.

And for the budding cinematographers out there without access to a 3D printer, there’s always this hardware store solution to focus pulling.


An External Autofocus For DSLRs

Most modern DSLR cameras support shooting full HD video, which makes them a great cheap option for video production. However, if you’ve ever used a DSLR for video, you’ve probably run into some limitations, including sluggish autofocus.

Sensopoda tackles this issue by adding an external autofocus to your DSLR. With the camera in manual focus mode, the device drives the focus ring on the lens. This allows for custom focus control code to be implemented on an external controller.

To focus on an object, the distance needs to be known. Sensopoda uses the HRLV-MaxSonar-EZ ultrasonic sensor for this task. An Arduino runs a control loop that implements a Kalman filter to smooth out the input. This is then used to control a stepper motor which is attached to the focus ring.
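The full write-up has the details, but the shape of the control loop is simple enough to sketch. In this hedged Arduino example (not the project’s actual firmware), the pin assignments, the analog scaling, the filter noise values, and the distance-to-steps calibration are all assumptions:

```cpp
// Hypothetical sketch of the Sensopoda approach: read the HRLV-MaxSonar-EZ
// analog output, smooth it with a scalar Kalman filter, and chase the focus
// ring with a stepper. Pins, noise values, and calibration are assumptions.
#include <Stepper.h>

const int SONAR_PIN     = A0;
const int STEPS_PER_REV = 200;
Stepper focusMotor(STEPS_PER_REV, 8, 9, 10, 11);   // assumed driver wiring

// Scalar (1-D) Kalman filter state
float estimate = 0, errEst = 1;
const float errMeas   = 15.0;   // assumed measurement noise, in mm
const float procNoise = 0.5;    // assumed process noise

bool initialized  = false;
long currentSteps = 0;

float kalman(float measurement) {
  if (!initialized) { estimate = measurement; initialized = true; }
  errEst += procNoise;
  float gain = errEst / (errEst + errMeas);
  estimate += gain * (measurement - estimate);
  errEst *= (1.0 - gain);
  return estimate;
}

// Hypothetical calibration: map 300-3000 mm of subject distance onto
// 0-1200 steps of focus-ring travel.
long distanceToSteps(float mm) {
  long clamped = constrain((long)mm, 300, 3000);
  return map(clamped, 300, 3000, 0, 1200);
}

void setup() {
  focusMotor.setSpeed(60);   // RPM
}

void loop() {
  // The HRLV-MaxSonar-EZ analog output works out to roughly 5 mm per ADC
  // count at 5 V; check the datasheet scaling for your supply voltage.
  float mm = analogRead(SONAR_PIN) * 5.0;
  long target = distanceToSteps(kalman(mm));
  focusMotor.step(target - currentSteps);   // move the focus ring
  currentSteps = target;
  delay(50);
}
```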

The design is interesting because it is rather universal; it can be adapted to run on pretty much any DSLR. The full writeup (PDF) gives all the details on the build.

3D Scanning By Calculating The Focus Of Each Pixel


We understand the concept [Jean] used to create a 3D scan of his face, but the particulars are a bit beyond our own experience. He is not using a dark room and a laser line to capture slices which can be reassembled later. Nope, this approach uses a stack of pictures taken at several different focus distances.

The idea is to process the photos using luminance. For each pixel, it looks at that pixel and its neighbors, subtracting the luminance values and summing the absolute differences to estimate how well that pixel is in focus. Do this for the entire image, and for a set of other images taken from the same vantage point at different focus settings, and you end up with a per-pixel depth map.
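As a rough illustration of that focus measure (our sketch, not [Jean]’s code): score each pixel by the sum of absolute luminance differences to its four neighbors, do that for every image in the stack, and record for each pixel the index of the image where the score peaks. That index is the depth map.

```cpp
// Illustrative depth-from-focus sketch, not [Jean]'s implementation.
// Each image in the stack is a grayscale (luminance) frame taken from the
// same vantage point at a different focus setting.
#include <vector>
#include <cstdint>
#include <cstdlib>

struct Image {
    int width, height;
    std::vector<uint8_t> lum;   // row-major luminance values
};

// Sharpness of pixel (x, y): sum of absolute differences to its 4 neighbours.
static int sharpness(const Image& img, int x, int y) {
    int c = img.lum[y * img.width + x];
    return std::abs(c - img.lum[y * img.width + (x - 1)]) +
           std::abs(c - img.lum[y * img.width + (x + 1)]) +
           std::abs(c - img.lum[(y - 1) * img.width + x]) +
           std::abs(c - img.lum[(y + 1) * img.width + x]);
}

// For every pixel, return the index of the stack image where it is sharpest;
// that index is a per-pixel depth estimate (border pixels are left at 0).
std::vector<int> depthFromFocus(const std::vector<Image>& stack) {
    const int w = stack[0].width, h = stack[0].height;
    std::vector<int> depth(w * h, 0), best(w * h, -1);
    for (size_t i = 0; i < stack.size(); ++i) {
        for (int y = 1; y < h - 1; ++y) {
            for (int x = 1; x < w - 1; ++x) {
                int s = sharpness(stack[i], x, y);
                if (s > best[y * w + x]) {
                    best[y * w + x]  = s;
                    depth[y * w + x] = static_cast<int>(i);
                }
            }
        }
    }
    return depth;
}
```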

What we find most interesting about this is that the resulting pixels retain their original color values. So after removing the cruft you get a 3D scan that is still in full color.

If you want to learn more about laser-based 3D scanning check out this project.

[Thanks Luca]

Hacking The Computer Interface Of A Ford Focus Mk2

You can do some neat stuff to the way your Ford Focus Mk2 works, but first you have to gain access to the data system. If you know some Russian, and don’t mind a bit of dongle rewiring, this guide will have you hacking the car’s CAN bus in no time. It was written by [Preee] and he has already added Radio RDS and CD Track information to the speedometer display panel, implemented hands free control for his cellphone, disabled the sounds the car makes when he goes into reverse, changed the door locking speed from 5mph to 10mph, and much more.

To gain access to the system you need hardware to bridge from a computer to the CAN bus. He hit eBay and bought an ELM327 cable which plugs into the On-Board Diagnostics port (OBD-II). There are two different ways these dongles can be configured, and since his wasn’t set up the right way for the Focus he had to alter it. His hardware changes are illustrated in the second post of the forum thread. Instead of just switching over to the other configuration, he wired up a toggle switch to select between the two.
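For a taste of what talking to one of these dongles looks like, here is a generic sketch (not [Preee]’s tool) that opens the ELM327’s serial port and puts it into monitoring mode; the device path, baud rate, and protocol choice are assumptions, and which protocol you actually need depends on which of the car’s buses the modified dongle is switched to.

```cpp
// Generic POSIX sketch of opening an ELM327 on its serial port and putting
// it into CAN monitoring mode. Device node, baud rate, and protocol are
// assumptions; the Focus-specific messages come from the forum thread.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstring>
#include <cstdio>

static void sendCmd(int fd, const char* cmd) {
    write(fd, cmd, strlen(cmd));
    write(fd, "\r", 1);                  // ELM327 commands end with a carriage return
    usleep(200000);                      // give the dongle time to answer
    char buf[256];
    int n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) { buf[n] = '\0'; printf("%s", buf); }
}

int main() {
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);   // assumed device node
    if (fd < 0) { perror("open"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B38400);           // common ELM327 default baud rate
    cfsetospeed(&tio, B38400);
    tio.c_cc[VMIN]  = 0;                 // let read() return after a short timeout
    tio.c_cc[VTIME] = 5;
    tcsetattr(fd, TCSANOW, &tio);

    sendCmd(fd, "ATZ");                  // reset the dongle
    sendCmd(fd, "ATSP6");                // ISO 15765-4 CAN, 11-bit IDs, 500 kbit/s
    sendCmd(fd, "ATMA");                 // monitor all traffic on the selected bus
    // ...raw CAN frames now stream back until another character is sent...

    close(fd);
    return 0;
}
```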

With hardware in place he grabbed some software and started hacking away. But as we hinted above, it’s not as simple as you might think. The software is in Russian. [Preee] did his best to add translations to a few screenshots, but it’s still going to be a bit of a bother trying to find your way around the GUI.

[Thanks Fred]