Seek And Ye Shall Command

If we count all the screens in our lives, it takes a hot minute. Some are touchscreens, some need a mouse or keyboard, but most of us are accustomed to all of these input devices. Not everyone can use every method, though, such as people with cerebral palsy who rely on eye-tracking hardware. Traditionally, that hardware only works on the computer it is connected to, so switching from a chair-mounted screen to a tablet on the desk is not an option. To give folks the ability to control different computers effortlessly, [Zack Freedman] is developing a head-mounted eye-tracker that is not tied to one computer. In a way, this is like a KVM switch, but way more futuristic. [Tony Stark] would be proud.

An infrared detector on the headset identifies compatible screens in its line of sight and syncs up with the associated HID dongle. A headset-mounted color camera tracks the head position relative to the screen, while an IR camera scans the eye to calculate where the user is focusing. All the technology here is proven, but this new recipe could be a game-changer for anyone who has trouble with the traditional keyboard, mouse, and touchscreen. Maybe QR codes could assist with screen identification and orientation, much like the way a Wii remote and sensor bar work together.
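To get a feel for the geometry involved, here is a toy sketch (entirely our own illustration, not [Zack]’s code) of turning an eye position and gaze direction into a point on a detected screen; all the vector names and the calibration they imply are assumptions:

```python
# Intersect a gaze ray with a screen modeled as a plane in world space.
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_x, screen_y):
    """eye_pos, gaze_dir: eye position and unit gaze direction (world frame)
    screen_origin: world position of the screen's top-left corner
    screen_x, screen_y: vectors spanning the screen's width and height
    Returns (u, v) in [0, 1] across the screen, or None if no hit."""
    normal = np.cross(screen_x, screen_y)
    denom = gaze_dir @ normal
    if abs(denom) < 1e-9:                 # gazing parallel to the screen
        return None
    t = ((screen_origin - eye_pos) @ normal) / denom
    if t < 0:                             # screen is behind the viewer
        return None
    hit = eye_pos + t * gaze_dir
    rel = hit - screen_origin
    u = (rel @ screen_x) / (screen_x @ screen_x)
    v = (rel @ screen_y) / (screen_y @ screen_y)
    return (u, v) if 0 <= u <= 1 and 0 <= v <= 1 else None

# Example: a 0.5 m wide screen one meter ahead, viewer gazing straight on.
eye = np.array([0.0, 0.0, 0.0])
gaze = np.array([0.0, 0.0, 1.0])
origin = np.array([-0.25, 0.15, 1.0])                    # top-left corner
print(gaze_point_on_screen(eye, gaze, origin,
                           np.array([0.5, 0.0, 0.0]),    # width vector
                           np.array([0.0, -0.3, 0.0])))  # height vector
```

A real headset would need calibrated camera poses to produce those vectors, but the intersection step itself stays this simple.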

Give Me A Minute, My Eyes Are Busy

Social cues are tricky, but humans are very good at detecting where someone is looking; that goes a long way toward figuring out where someone is placing their attention. All of this goes right out the window, though, when you’re talking with somebody who uses eye-tracking software to speak. [Matthew Oppenheim] of Lancaster University, UK, wants to give listeners the message of Give Me a Minute with an easy-to-recognize indicator. His choice is a micro:bit, which displays a rotating arrow on its LED array while someone composes their speech. He chose the micro:bit because they are readily available, and you can get cases to fit people’s personalities. After the break, you can see a demonstration, though the graphic appears scrambled because of screen flicker. The rotating arrow is a clear indicator that someone is writing, whereas a clock might suggest a frozen computer, and a progress bar could not honestly predict how long a message will take to compose.
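For a sense of how little code the indicator side needs, here is a MicroPython sketch along those lines; the one-byte serial protocol is our assumption, not [Matthew]’s actual interface:

```python
# Hypothetical micro:bit sketch: spin an arrow on the LED array while a
# "composing" flag is set over USB serial (assumed protocol: the host
# sends b'1' to start and b'0' to stop).
from microbit import display, Image, uart, sleep

ARROWS = [Image.ARROW_N, Image.ARROW_NE, Image.ARROW_E, Image.ARROW_SE,
          Image.ARROW_S, Image.ARROW_SW, Image.ARROW_W, Image.ARROW_NW]

composing = False
frame = 0

while True:
    data = uart.read(1)              # returns None if no byte is waiting
    if data == b'1':
        composing = True
    elif data == b'0':
        composing = False
        display.clear()
    if composing:
        display.show(ARROWS[frame % len(ARROWS)])
        frame += 1
    sleep(150)                       # roughly seven frames per second
```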

[Matthew] wrote a program for the interpreting computer that recognizes when a message is forming by monitoring the number of black pixels in the composition field. If that count changes, someone must be composing a sentence. Many people will try to peek over the speaker’s shoulder to see if they are working, but we’re sure most readers would join the users of such tech in being unhappy if someone blatantly stared at their computer screen while they were typing.
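The detection idea is simple enough to sketch. Here is a minimal stand-in (not [Matthew]’s program) that watches a region of the screen and reports when its black-pixel count changes; the region, threshold, and polling rate are all placeholders:

```python
# Watch the message-composition area of the screen and flag activity
# whenever the number of near-black (text) pixels changes.
import time
import numpy as np
from PIL import ImageGrab   # pip install pillow

REGION = (100, 100, 700, 200)   # left, top, right, bottom of the text field
THRESHOLD = 60                  # gray levels below this count as "black"

def black_pixel_count():
    frame = np.asarray(ImageGrab.grab(bbox=REGION).convert("L"))
    return int((frame < THRESHOLD).sum())

last = black_pixel_count()
while True:
    time.sleep(0.5)
    current = black_pixel_count()
    if current != last:
        print("composing")   # the real program signals the micro:bit here
    last = current
```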

Wheelchairs don’t always have to come from a hospital or supply store, and they don’t have to stay on the ground.

Continue reading “Give Me A Minute, My Eyes Are Busy”

Open Source Headset With Inside-Out Tracking, Video Passthrough

The folks behind the Atmos Extended Reality (XR) headset want to provide improved accessibility with an open ecosystem, and they aim to do it with a WebVR-capable headset design that is self-contained, 3D-printable, and open-sourced. Their immediate goal is to release a development kit, then refine the design for a wider release.

An early prototype of the open source Atmos Extended Reality headset.

The front of the headset has a camera-based tracking board to provide all the modern goodies like inside-out head and hand tracking as well as the ability to pass through video. The design also provides for a variety of interface methods such as eye tracking and 6 DoF controllers.

With all that, the headset gives users maximum flexibility to experiment with and create different applications, while the project works to keep development simple. A short video showing off the modular design of the HMD and optical assembly is embedded below.

Extended Reality (XR) has emerged as a catch-all term to cover broad combinations of real and virtual elements. On one end of the spectrum are completely virtual elements such as in virtual reality (VR), and towards the other end of the spectrum are things like augmented reality (AR) in which virtual elements are integrated with real ones in varying ratios. With the ability to sense the real world and pass through video from the cameras, developers can choose to integrate as much or as little as they wish.
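The “varying ratios” part is easy to picture in code: with passthrough frames available, blending real and virtual imagery can be as simple as a weighted sum. A minimal OpenCV illustration (with made-up file names, and assuming both frames share a size):

```python
# Blend a passthrough camera frame with a rendered virtual frame.
# mix = 0.0 is pure VR; mix = 1.0 is pure passthrough (the AR end).
import cv2

camera_frame = cv2.imread("passthrough.png")   # stand-in for a live capture
virtual_frame = cv2.imread("render.png")       # stand-in for the GPU render

mix = 0.6   # how much of the real world shows through
composite = cv2.addWeighted(camera_frame, mix, virtual_frame, 1.0 - mix, 0.0)
cv2.imwrite("composite.png", composite)
```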

Terms like XR are a sign that the whole scene is still rapidly changing, and it’s fascinating to see how development in this area remains within reach of small developers and individual hackers. The team aims to release the Atmos DK 1 developer kit sometime in July, so anyone interested in getting in on the ground floor should read up on how to get involved with the project, which currently points people to their Twitter account (@atmosxr) and invites developers to their Discord server. You can also follow along on their newly published Hackaday.io page.

Continue reading “Open Source Headset With Inside-Out Tracking, Video Passthrough”

Low-Cost Eye Tracking With Webcams And Open-Source Software

“What are you looking at?” Said the wrong way, those can be fighting words. But in fields as diverse as psychological research and user experience testing, knowing what people are looking at in real-time can be invaluable. Eye-tracking software does this, but generally at a cost that keeps it out of the hands of the home gamer.

Or it used to. With hacked $20 webcams, this open source eye tracker will let you watch how someone is processing what they see. But [John Evans]’ Hackaday Prize entry is more than that. Most of the detail is in the video below, a good chunk of which [John] uses to extol the virtues of the camera he uses for his eye tracker, a Logitech C270. And rightly so — the cheap and easily sourced camera has remarkable macro capabilities right out of the box, a key feature for a camera that’s going to be trained on an eyeball a few millimeters away. Still, [John] provides STL files for mounts that snap to the torn-down camera PCB, in case other focal lengths are needed.

The meat of the project is his Jevons Camera Viewer, an app he wrote to control and view two cameras at once. Originally written for a pick-and-place machine, the software can coordinate the views of two goggle-mounted cameras, one looking out and one focused on the user’s eye. Reflections of the camera’s LED are picked up and used to judge the angle of the eye, with an overlay applied to the other camera’s view to show where the user is looking. It seems quite accurate, and plenty fast to boot.
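As a rough sketch of how glint-based tracking works (our illustration, not [John]’s code): the eye camera’s brightest spot is the LED reflection and the darkest blob is the pupil, and the offset between them estimates eye rotation. Thresholds and the mapping to the scene camera are placeholders:

```python
# Estimate gaze from the pupil-to-glint offset in an eye-camera feed.
import cv2

cap = cv2.VideoCapture(0)   # the eye-facing camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # the LED glint is the brightest point in the (blurred) image
    _, _, _, glint = cv2.minMaxLoc(cv2.GaussianBlur(gray, (9, 9), 0))
    # the pupil is the darkest region; a real tracker fits an ellipse
    _, pupil_mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(pupil_mask)
    if m["m00"] > 0:
        pupil = (m["m10"] / m["m00"], m["m01"] / m["m00"])
        dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
        print("gaze offset:", dx, dy)   # map this onto the scene overlay
    cv2.imshow("eye", frame)
    if cv2.waitKey(1) == 27:            # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```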

We think this is a great project, like so many others in the first round of the 2018 Hackaday Prize. Can you think of an awesome project based on eye tracking? Here’s your chance to get going on the cheap.

Continue reading “Low-Cost Eye Tracking With Webcams And Open-Source Software”

Redirected Walking In VR Done Via Exploit Of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.

Saccadic movements

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.

Part one of [Anjul] and [Qi]’s method is to manipulate the perception of a virtual area relative to the actual physical area by making a person’s pivots not a 1:1 match. In VR, it may appear that one has turned more or less than one actually has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eye for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement, stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area available.
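Boiled down, the gating logic looks something like this sketch (the threshold and gain values are illustrative, not from the paper):

```python
# Inject extra virtual yaw only while the eye is mid-saccade.
SACCADE_THRESHOLD = 180.0   # deg/s; saccades can reach several hundred deg/s
MAX_GAIN_STEP = 0.15        # degrees of extra yaw injected per frame

def redirect_yaw(eye_velocity_deg_s, remaining_offset_deg, virtual_yaw):
    """Nudge virtual_yaw toward the desired redirection, but only while
    the measured eye velocity says a saccade is underway."""
    if eye_velocity_deg_s < SACCADE_THRESHOLD:
        return virtual_yaw, remaining_offset_deg   # eyes tracking: do nothing
    step = max(-MAX_GAIN_STEP, min(MAX_GAIN_STEP, remaining_offset_deg))
    return virtual_yaw + step, remaining_offset_deg - step
```

Called every frame, this drains the desired redirection a fraction of a degree at a time, always hidden inside the visual blackout of a saccade.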

Embedded below is a video demonstration and overview, which covers other methods of manipulating the perception of space in VR and how this approach avoids their pitfalls.

Continue reading “Redirected Walking In VR Done Via Exploit Of Eyeballs”

IoT Doorman: Eye-Controlled Door For A Girl With Cerebral Palsy

Kyleigh has an eye-controlled computer on her wheelchair, but something as simple as her bedroom door was still beyond her reach… until now! [Bill Binko] recently filmed a demo of an automatic, IoT door opener built for the young girl, who has cerebral palsy. [Bill] is a co-founder of ATMakers, an organization that enables makers interested in assistive technologies to collaborate with users to improve quality of life.

Using her eye-tracking tablet (a PRC device), Kyleigh has two new icons that make the relevant call to a website, pushing a simple command to either open or close her bedroom door. The device attached to the door uses an Adafruit Feather M0 WiFi board, a DC stepper motor and wheel, a UBEC buck converter, and a potentiometer.

Since other family members will also be opening and closing the door, there’s a potentiometer that measures the door’s position, so the opener knows where the door is the next time Kyleigh wishes to use it. The installation also maintains a fairly inconspicuous profile for the assistance it gives: the controller is enclosed in a small box on the door, with the only slightly larger motor mounted at the door’s base.
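The control flow is straightforward to sketch. Here is a desktop-Python analogue (hypothetical endpoint, simulated hardware, and not [Bill]’s actual firmware, which runs on the Feather):

```python
# Poll a web endpoint for "open"/"close", then drive the motor until the
# potentiometer reports the door has reached the target position.
import time
import requests

COMMAND_URL = "https://example.com/door/command"   # placeholder URL
OPEN_POS, CLOSED_POS, TOLERANCE = 0.9, 0.1, 0.05   # normalized pot readings

_position = CLOSED_POS   # simulated potentiometer state for this sketch

def read_door_position():
    """Return the door position, 0.0 (closed) to 1.0 (open)."""
    return _position

def drive_motor(direction):
    """Run the motor: +1 opens, -1 closes, 0 stops (simulated here)."""
    global _position
    _position = min(1.0, max(0.0, _position + 0.02 * direction))

while True:
    command = requests.get(COMMAND_URL, timeout=5).text.strip()
    target = OPEN_POS if command == "open" else CLOSED_POS
    # closed-loop move: the pot also catches a door someone moved by hand
    while abs(read_door_position() - target) > TOLERANCE:
        drive_motor(1 if read_door_position() < target else -1)
        time.sleep(0.05)
    drive_motor(0)
    time.sleep(1.0)   # poll roughly once a second
```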

[Bill] believes the project still has a few quibbles to work out, such as shortening the wait before the open/close process kicks in and optimizing the opening and closing speed. You have to check out the video below to see how well it works. We’re also excited to see Kyleigh using her gaze control to talk to an Amazon Echo, and [Bill] foresees a door control improvement that links it to Alexa. And how much did it cost to improve the quality of life for this young girl? $70.

We love seeing makers help people, and cannot wait to see what 2018 will bring! If you’re looking for more inspiration, don’t miss the eye-controlled wheelchair project called Eyedrivomatic which won the 2015 Hackaday Prize. There’s also the top Assistive Technology projects from the Hackaday Prize.

Continue reading “IoT Doorman: Eye-Controlled Door For A Girl With Cerebral Palsy”

Hackaday Prize Entry: Real Life XEyes

There’s a lot of tech that goes into animatronics, cosplay, and costumes. For their Hackaday Prize entry, [Dasaki] and [Dylan] are taking the eyes in a costume or Halloween prop to the next level with animatronic eyes that look where the wearer of this crazy confabulation is looking. It’s XEyes in real life, and it promises to be a part of some very, very cool costumes.

The mechanics of this system are actually pretty simple: just a few servos joined together to make a pair of robotic eyes move up and down and left to right. The entire mechanism is mounted on a frame, to which is attached a very small camera pointed directly at the user’s (real) eye. The software is where things get fun. It’s a basic eye-tracking setup, with IR light illuminating the pupil and a compute unit that calculates where the user is looking.
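The final mapping step is easy to picture; here is a toy sketch (the servo travel limits are made up) that converts a normalized pupil position from the eye tracker into pan and tilt angles for the animatronic eyes:

```python
# Map a normalized pupil position to pan/tilt servo angles.
def pupil_to_servo(pupil_x, pupil_y,
                   pan_range=(60, 120), tilt_range=(70, 110)):
    """pupil_x and pupil_y run 0..1 across the eye-camera frame; returns
    (pan_deg, tilt_deg) confined to the servos' safe travel."""
    pan = pan_range[0] + pupil_x * (pan_range[1] - pan_range[0])
    tilt = tilt_range[0] + pupil_y * (tilt_range[1] - tilt_range[0])
    return pan, tilt

# With something like an OpenMV camera reporting pupil coordinates,
# each (pan, tilt) pair would be written out to the two servo channels.
print(pupil_to_servo(0.5, 0.5))   # centered gaze puts servos at mid-travel
```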

For the software, [Dasaki] and [Dylan] have collected a bunch of links, but right now the best solutions are the OpenMV and the Eye of Horus project from last year’s Hackaday Prize. It’s a great project, and a really fun entry for the Automation portion of this year’s Hackaday Prize.