Eye-Tracking Device Is A Tiny Movie Theatre For Jumping Spiders

The eyes are windows into the mind, and this research into what jumping spiders look at and why required a clever device that performs eye tracking, but for spiders. The eyesight of these fascinating creatures has a lot in common with our own: both perceive a wide-angle region at lower visual fidelity, but can direct attention to areas of interest within it to see greater detail. Researchers have been able to perform eye tracking on jumping spiders, literally showing exactly where they are looking in real time, with the help of a custom device that works a little bit like a miniature movie theatre.

A harmless temporary adhesive on top (and a foam ball for a perch) holds a spider in front of a micro movie projector and IR camera. Spiders were not harmed in the research.

To do this, researchers had to get clever. The unblinking lenses of a spider’s two front-facing primary eyes do not move. Instead, to look at different things, the spider shifts the cone-shaped inside of each eye around with muscles, effectively pulling the retina to point toward different areas of interest. Each primary eye has a boomerang-shaped retina; together the pair forms an X-shaped region of higher-resolution vision that the spider directs as needed.

So how does the spider eye tracker work? The spider perches on a tiny foam ball and is attached — with the help of a harmless and temporary adhesive based on beeswax — to a small bristle. In this way, the spider is held stably in front of a video screen without otherwise being restrained. The spider is shown home movies while an IR camera picks up the reflection of IR off the retinas inside the spider’s two primary eyes. By superimposing the IR reflection onto the displayed video, it becomes possible to literally see exactly where the spider is looking at any given moment. This is similar in some ways to how eye tracking is done for humans, which also uses IR, but watches the position of the pupil.
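For the curious, here’s a minimal sketch of what that superimposition step might look like using OpenCV. Everything here is our own assumption — the brightness threshold, the blob filtering, and the calibration homography H mapping IR-camera pixels into the projected video — not the researchers’ actual pipeline:

```python
# Sketch: find the bright IR reflections from the spider's retinas in
# the IR camera frame and mark them on the projected video frame.
# H (IR camera -> projector coordinates) is assumed to come from a
# one-time calibration; all parameters here are illustrative.
import cv2
import numpy as np

def find_retina_reflections(ir_frame, thresh=200, min_area=5):
    """Return centroids of bright blobs (retinal IR reflections)."""
    gray = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # ignore specks of sensor noise
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def overlay_gaze(video_frame, centroids, H):
    """Map IR-camera centroids into video coordinates and mark them."""
    for (x, y) in centroids:
        pt = cv2.perspectiveTransform(
            np.array([[[x, y]]], dtype=np.float32), H)[0][0]
        cv2.drawMarker(video_frame, (int(pt[0]), int(pt[1])),
                       (0, 0, 255), cv2.MARKER_CROSS, 20, 2)
    return video_frame
```

In practice the homography would come from a one-time calibration, for instance by projecting known markers and locating them in the IR camera’s view.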

In the short video embedded below, if you look closely you can see the two retinas make an X-shape of a faintly lighter color than the rest of the background. Watch the spider find and focus on the silhouette of a tasty cricket, but when a dark oval appears and grows larger (as it would look if it were getting closer) the spider’s gaze quickly snaps over to the potential threat.

Feel a need to know more about jumping spiders? This eye-tracking research was featured as part of a larger Science News article highlighting the deep sensory spectrum these fascinating creatures inhabit, most of which is completely inaccessible to humans.

Continue reading “Eye-Tracking Device Is A Tiny Movie Theatre For Jumping Spiders”

Hands-Free Page Turning

For people who can’t lift a finger to turn the page on their ebooks, a solution is at hand. Seoul-based technology company Visual Camp has adapted their eye tracking algorithms to an ebook reader. (Video, embedded below.) Reportedly this is the first time an ebook reader has been so equipped.

If your eye lingers on the page turn button, it will turn the page. While this particular application seems innocuous, some of the other applications being touted seem a little contrived, if not invasive. For example, by applying gaze analysis while you read, they claim to be able to make targeted recommendations for other books.
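The dwell-to-activate behavior itself is simple enough to sketch. Here’s a minimal, hypothetical version in Python — the button geometry, the 0.8 second dwell threshold, and the callback are our own illustrative stand-ins, not Visual Camp’s implementation:

```python
import time

class DwellButton:
    """Fire a callback when the gaze dwells inside a screen region."""
    def __init__(self, bbox, dwell_s=0.8):
        self.bbox = bbox          # (x0, y0, x1, y1) in screen pixels
        self.dwell_s = dwell_s
        self._start = None

    def update(self, x, y, on_activate):
        """Call with each new gaze estimate (x, y)."""
        x0, y0, x1, y1 = self.bbox
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            self._start = None    # gaze left the button: reset the timer
            return
        if self._start is None:
            self._start = time.monotonic()
        elif time.monotonic() - self._start >= self.dwell_s:
            on_activate()         # e.g. turn the page
            self._start = None    # one activation per dwell
```

A real implementation would want some hysteresis or a short grace period so that normal gaze jitter near the button edge doesn’t constantly reset the timer.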

We’ve discussed eye tracking devices before, but those have relied on dedicated hardware. Visual Camp claims their AI-based technology only requires a color camera and can be integrated into existing camera-equipped devices, such as this ebook reader. They also offer an SDK for developers who want to add eye tracking control to their apps. Eye tracking is hard, though, and the devil is in the details. It’d be neat to see what they’re up to.

Continue reading “Hands-Free Page Turning”

Seek And Ye Shall Command

If we count all the screens in our lives, it takes a hot minute. Some of them are touchscreens, some need a mouse or keyboard, but we are accustomed to all the input devices. Not everyone can use all of these methods, though; some people with cerebral palsy, for example, rely on eye-tracking hardware. Traditionally, that only works on the connected computer, so switching from a chair-mounted screen to a tablet on the desk is not an option. To give folks the ability to control different computers effortlessly, [Zack Freedman] is developing a head-mounted eye-tracker that is not tied to one computer. In a way, this is like a KVM switch, but way more futuristic. [Tony Stark] would be proud.

An infrared detector on the headset identifies compatible screens in its line of sight and syncs up with the associated HID dongle. A headset-mounted color camera tracks the head position relative to the screen, while an IR camera scans the eye to calculate where the user is focusing. All the technology here is proven, but this new recipe could be a game-changer for anyone who has trouble with the traditional keyboard, mouse, and touchscreen. Maybe QR codes could assist with screen identification and orientation, much like how a Wii remote and sensor bar work together.
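If you wanted to prototype that QR idea, OpenCV’s built-in detector gets you most of the way there. To be clear, this is purely our speculation — the actual project identifies screens with IR — but a sketch might look like:

```python
# Sketch of the QR-code idea: each screen displays a QR code encoding
# its ID, and the headset camera both identifies the screen and
# recovers the code's corner positions for orientation, much as a Wii
# remote uses the sensor bar. Purely illustrative, not the project's
# actual IR-based approach.
import cv2

detector = cv2.QRCodeDetector()

def identify_screen(frame):
    """Return (screen_id, corner points) or None if no code is seen."""
    data, points, _ = detector.detectAndDecode(frame)
    if not data:
        return None
    # points holds the code's four corners in image space -- enough to
    # estimate where the screen sits relative to the headset camera.
    return data, points.reshape(4, 2)
```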

Give Me A Minute, My Eyes Are Busy

Social cues are tricky, but humans are very good at detecting where someone is looking; that goes a long way toward figuring out where someone is placing their attention. All of this goes right out the window, though, when you’re talking with somebody who uses eye-tracking software to speak. [Matthew Oppenheim] of Lancaster University, UK wants to give listeners the message of Give Me a Minute with an easy-to-recognize indicator. His choice is a micro:bit, which displays a rotating arrow on its LED array while someone composes their speech. He chose the micro:bit because they are readily available, and you can get cases to fit people’s personalities. After the break, you can see a demonstration, though the graphic appears scrambled due to screen flicker. The rotating arrow is a clear indicator that someone is writing, whereas a clock might suggest a frozen computer, and a progress bar could not be accurate.

[Matthew] wrote a program for the interpreting computer which recognizes when a message is forming by monitoring the number of black pixels in the composition field. If that count changes, someone must be composing a sentence. Many people will try to peek over the speaker’s shoulder to see if they are working, but we’re sure most readers would join the users of such tech in being unhappy if someone blatantly looked at their computer screen while they were typing.
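The pixel-counting trick is delightfully simple. Here’s a rough sketch of the idea in Python — the screen region, serial port, and single-character protocol to the micro:bit are all our own stand-ins, not [Matthew]’s actual code:

```python
# Sketch: grab the composition field off the screen, count near-black
# pixels, and tell the micro:bit (over USB serial) to show the
# "composing" arrow whenever the count changes. Region, port, and
# threshold values are assumptions for illustration.
import time
import serial                      # pyserial
from PIL import ImageGrab

REGION = (100, 100, 900, 300)      # composition field, screen coords
port = serial.Serial("/dev/ttyACM0", 115200)

def count_dark_pixels(threshold=40):
    img = ImageGrab.grab(bbox=REGION).convert("L")
    return sum(1 for p in img.getdata() if p < threshold)

last = count_dark_pixels()
while True:
    now = count_dark_pixels()
    # A changing count means text is being typed or deleted, so the
    # user is composing; tell the micro:bit to animate its arrow.
    port.write(b"C\n" if now != last else b"I\n")
    last = now
    time.sleep(0.5)
```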

Wheelchairs don’t always have to come from a hospital or supply store, and they don’t have to stay on the ground.

Continue reading “Give Me A Minute, My Eyes Are Busy”

Open Source Headset With Inside-Out Tracking, Video Passthrough

The folks behind the Atmos Extended Reality (XR) headset want to provide improved accessibility with an open ecosystem, and they aim to do it with a WebVR-capable headset design that is self-contained, 3D-printable, and open-sourced. Their immediate goal is to release a development kit, then refine the design for a wider release.

An early prototype of the open source Atmos Extended Reality headset.

The front of the headset has a camera-based tracking board to provide all the modern goodies like inside-out head and hand tracking as well as the ability to pass through video. The design also provides for a variety of interface methods such as eye tracking and 6 DoF controllers.

With all that, the headset gives users maximum flexibility to experiment with and create different applications while working to keep development simple. A short video showing off the modular design of the HMD and optical assembly is embedded below.

Extended Reality (XR) has emerged as a catch-all term to cover broad combinations of real and virtual elements. On one end of the spectrum are completely virtual elements such as in virtual reality (VR), and towards the other end of the spectrum are things like augmented reality (AR) in which virtual elements are integrated with real ones in varying ratios. With the ability to sense the real world and pass through video from the cameras, developers can choose to integrate as much or as little as they wish.

Terms like XR are a sign that the whole scene is still rapidly changing, and it’s fascinating to see that development in this area remains within reach of small developers and individual hackers. The Atmos DK 1 developer kit is slated for release sometime in July, so anyone interested in getting in on the ground floor should read up on how to get involved with the project, which currently points people to their Twitter account (@atmosxr) and invites developers to their Discord server. You can also follow along on their newly published Hackaday.io page.

Continue reading “Open Source Headset With Inside-Out Tracking, Video Passthrough”

Low-Cost Eye Tracking With Webcams And Open-Source Software

“What are you looking at?” Said the wrong way, those can be fighting words. But in fields as diverse as psychological research and user experience testing, knowing what people are looking at in real-time can be invaluable. Eye-tracking software does this, but generally at a cost that keeps it out of the hands of the home gamer.

Or it used to. With hacked $20 webcams, this open source eye tracker will let you watch how someone is processing what they see. But [John Evans]’ Hackaday Prize entry is more than that. Most of the detail is in the video below, a good chunk of which [John] uses to extol the virtues of the camera he uses for his eye tracker, a Logitech C270. And rightly so — the cheap and easily sourced camera has remarkable macro capabilities right out of the box, a key feature for a camera that’s going to be trained on an eyeball a few millimeters away. Still, [John] provides STL files for mounts that snap to the torn-down camera PCB, in case other focal lengths are needed.

The meat of the project is his Jevons Camera Viewer, an app he wrote to control and view two cameras at once. Originally written for a pick-and-place machine, the software can coordinate the views of two goggle-mounted cameras, one looking out and one focused on the user’s eye. Reflections of the camera’s LED are picked up and used to judge the angle of the eye, and an overlay is applied to the other camera’s view to show where the user is looking. It seems quite accurate, and plenty fast to boot.
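The glint-based approach is a classic, and the core of it fits in a few lines. Below is a hypothetical sketch, not [John]’s actual code: in the eye camera, the LED reflection is the brightest spot and the pupil the darkest blob, and their offset approximates gaze direction, which a simple linear calibration maps into scene-camera pixels. The gains and center point here are placeholders that would come from a per-user calibration:

```python
# Sketch of glint-based gaze estimation: find the brightest point
# (LED glint) and darkest point (pupil) in the eye-camera frame; their
# offset approximates where the eye is pointing.
import cv2
import numpy as np

def gaze_offset(eye_frame):
    gray = cv2.cvtColor(eye_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (9, 9), 0)   # suppress pixel noise
    _, _, dark_pt, bright_pt = cv2.minMaxLoc(gray)
    # bright_pt ~ LED glint, dark_pt ~ pupil centre
    return np.subtract(dark_pt, bright_pt)     # (dx, dy) in pixels

def to_scene_coords(offset, gain=(12.0, 12.0), center=(320, 240)):
    """Linear map from eye-camera offset to a scene-camera pixel."""
    return (int(center[0] + gain[0] * offset[0]),
            int(center[1] + gain[1] * offset[1]))
```

The returned scene coordinate is what you’d draw as the crosshair overlay on the outward-facing camera’s view.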

We think this is a great project, like so many others in the first round of the 2018 Hackaday Prize. Can you think of an awesome project based on eye tracking? Here’s your chance to get going on the cheap.

Continue reading “Low-Cost Eye Tracking With Webcams And Open-Source Software”

Redirected Walking In VR Done Via Exploit Of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.

Saccadic movements

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.

Part one of [Anjul] and [Qi]’s method is to manipulate the perception of a virtual area relative to the actual physical area by making a person’s pivots not match 1:1. In VR, it may appear one has turned more or less than one has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eye for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement and stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area actually available.
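The core loop is easy to caricature. Below is a hypothetical sketch of saccade-gated rotation injection — the velocity threshold and per-frame gain are illustrative values of our own, not the numbers from the paper:

```python
# Sketch: estimate angular eye velocity from successive gaze samples
# and inject a small extra camera yaw only while a saccade is in
# flight, when the change is imperceptible. Threshold and gain are
# illustrative assumptions.
SACCADE_THRESHOLD = 180.0   # deg/s of eye rotation
MAX_INJECT = 0.15           # max extra yaw per frame, degrees

def redirect_step(gaze_deg, prev_gaze_deg, dt, yaw_error):
    """Return the extra camera yaw to apply this frame.

    gaze_deg / prev_gaze_deg: gaze azimuth in degrees;
    yaw_error: remaining rotation we still want to sneak in.
    """
    eye_velocity = abs(gaze_deg - prev_gaze_deg) / dt
    if eye_velocity < SACCADE_THRESHOLD:
        return 0.0                            # eyes stable: do nothing
    return max(-MAX_INJECT, min(MAX_INJECT, yaw_error))
```

Run once per rendered frame, the injected yaw accumulates over many saccades until the virtual heading has drifted by the desired amount without the wearer noticing.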

Embedded below is a video demonstration and overview, which also covers other methods of manipulating the perception of space in VR and how this technique avoids their pitfalls.

Continue reading “Redirected Walking In VR Done Via Exploit Of Eyeballs”