Webcam Eye-tracking Moves Robot-powered Skittles Candy


This is a great hack, and it’s an advertisement. We wish this were the norm when it comes to advertising because they’ve really got our number. Skittles enlisted a few engineers to build a web interface that moves robot-powered candies.

When we started looking into this we figured a few robots had been covered with over-sized cases that looked like Skittles. But that’s not it at all. What you see above is actually upside down. The top side of the white surface has one tiny wheeled robot for each candy, and a magnet embedded in each Skittle holds it to the underside of the surface. The user interface was rolled out on a Facebook page. It uses a common webcam for eye tracking: when you move your eyes, the robot controlling your assigned candy moves in that direction. See for yourself in the clip after the break.
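The Facebook interface is long gone, but the core trick of turning webcam images into motion commands is approachable with off-the-shelf tools. Here’s a minimal sketch of the idea using OpenCV’s stock Haar eye cascade, where the darkest point in the eye region stands in for the pupil; the thresholds and the printed “command” are our own placeholders, and the real rig certainly did something more sophisticated:

```python
import cv2

# Find an eye with OpenCV's bundled Haar cascade, take the darkest pixel
# in the eye region as a crude pupil estimate, and map its horizontal
# offset to a move command. Purely illustrative, not the Skittles rig.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
cam = cv2.VideoCapture(0)

while cam.isOpened():
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.3, 5):
        eye = cv2.GaussianBlur(gray[y:y + h, x:x + w], (9, 9), 0)
        _, _, (px, _), _ = cv2.minMaxLoc(eye)   # darkest point ≈ pupil
        offset = px / w                         # 0.0 = far left, 1.0 = far right
        cmd = "left" if offset < 0.4 else "right" if offset > 0.6 else "hold"
        print(cmd)   # stand-in for sending a drive command to the robot
```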

So we say bravo, Mars Inc. We love that you decided to show off what’s behind the curtain. As with the Hyundai pixel wall, there’s a whole subset of people who might ignore the ad, but will spend a lot of time finding out how it was done.

Continue reading “Webcam Eye-tracking Moves Robot-powered Skittles Candy”

Tracking Eye Movement By Measuring Electrons In The Eye

[Luis Cruz] is a Honduran high school student who built an amazing electrooculography system, and his writeup (PDF warning) of the project is one of the best we’ve seen.

[Luis] goes through the theory of the electrooculogram – the human eye is polarized from front to back because of a negative charge in the nerve endings in the retina. Because of this minute difference in charge, a user’s gaze can be tracked by electrodes attached to the skin around the eye. After connecting eye electrodes to opamps and a microcontroller, [Luis] imported the data with a Python script and wrote an “eyeboard” application to enable text input using only eye movement. The original goal of the project was to build an interface for severely disabled people, but [Luis] sees applications for sleep research and gathering marketing data.
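We haven’t reproduced [Luis]’ actual script, but the receiving side of such a system can be sketched in a few lines of Python. This assumes the microcontroller streams comma-separated horizontal and vertical channel ADC readings over serial; the port, baud rate, and thresholds are all placeholder values:

```python
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200   # placeholder serial settings
CENTER, THRESHOLD = 512, 80           # 10-bit ADC midpoint, deflection cutoff

def classify(h, v):
    """Map EOG channel deflections to a gaze direction."""
    if h - CENTER > THRESHOLD:
        return "right"
    if CENTER - h > THRESHOLD:
        return "left"
    if v - CENTER > THRESHOLD:
        return "up"
    if CENTER - v > THRESHOLD:
        return "down"
    return "center"

with serial.Serial(PORT, BAUD, timeout=1) as eog:
    while True:
        line = eog.readline().decode(errors="ignore").strip()
        try:
            h, v = (int(x) for x in line.split(","))
        except ValueError:
            continue  # skip malformed frames
        print(classify(h, v))
```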

We covered [Luis]’ homebrew 8-bit console last year, and he’s now controlling his Pong clone with his eye-tracking device. We’re reminded of a similar system developed by Atari, but [Luis]’ system uses a method that won’t give the user a headache after 15 minutes.

Check out [Luis] going through the capabilities of his interface after the break. Continue reading “Tracking Eye Movement By Measuring Electrons In The Eye”

Open Source Tracker Keeps An Eye On Furry Friends

Most of the time, you’ll know where your cats are — asleep on the bed about 23.5 hours a day and eating or pooping the rest of the time. But some cats are more active than others, so there are commercial options for those who want to keep tabs on their pet. Unfortunately, [Sahas Chitlange] didn’t like any of them, so he designed and built his own open source version: FindMyCat.io.

The system has two parts: a module that fits onto a cat collar, and a home station that, well, stays at home. It offers a variety of tracking modes. In home mode, the home station signals the collar every 10 seconds, and the collar spends most of its time in a deep sleep. If the collar stops hearing from the home station, it switches to ping mode, where it waits for a command from the FindMyCat app over its LTE-M connection and reports its location.
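As a rough illustration of that mode logic (the function names, timeout, and sleep intervals below are our own placeholders, not [Sahas]’s firmware, which runs on the collar’s ARM core), the collar’s main loop might be structured like this:

```python
import time

HOME_TIMEOUT = 30   # seconds without a base-station beacon before leaving home mode

def heard_home_station():
    """Placeholder: did we receive the home station's 10-second beacon?"""
    return False

def lte_locate_requested():
    """Placeholder: poll the LTE-M connection for a locate command."""
    return False

def report_location():
    """Placeholder: take a GPS fix and send it up over LTE-M."""
    pass

mode = "home"
last_beacon = time.monotonic()
while True:
    if mode == "home":
        if heard_home_station():
            last_beacon = time.monotonic()
        elif time.monotonic() - last_beacon > HOME_TIMEOUT:
            mode = "ping"      # lost the base station, fall back to cellular
        time.sleep(10)         # stand-in for the MCU's deep sleep between beacons
    else:                      # ping mode
        if lte_locate_requested():
            report_location()
        time.sleep(1)
```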

Finally, the app can set the collar to Lost Kitteh mode, where the collar sends its location to the app either every seven minutes or every thirty seconds. The collar also supports a direction-finding feature, using the ultra wideband (UWB) radio in recent Apple iPhones to show you the direction and distance of the tracked cat.

The collar is built around a Nordic Semiconductor nRF9160, a System in a Package (SiP) that does most of the heavy lifting, as it includes GPS, an LTE-M modem, and an ARM processor. One interesting feature here: [Sahas] doesn’t lay out his own antennas on the PCB, but instead uses an Ignion NN03-310, an off-the-shelf antenna that is already qualified for LTE-M use. That means this system can be connected to almost any LTE-M network without getting yelled at for using unqualified hardware and making the local cell towers explode.

The collar also includes a DWM3001CDK UWB module used for the locator feature. The accompanying app uses this together with Apple’s UWB support to show the user which direction the cat is in, and how far away it is. The app isn’t in the Apple App Store yet, so you’ll need to sign up for an Apple Developer account to use it. We’d love to hear from anyone who takes it for a test drive with their own pet.
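As a purely illustrative aside, turning a UWB range and direction vector into something you can draw on screen is mostly trigonometry. The coordinate conventions below are our own assumption, not Apple’s exact API:

```python
import math

def arrow_from_uwb(direction, distance_m):
    """Turn a unit direction vector (x right, y up, z forward, in the
    phone's frame) plus a UWB range into UI-friendly numbers."""
    x, y, z = direction
    azimuth = math.degrees(math.atan2(x, z))    # degrees left/right of straight ahead
    elevation = math.degrees(math.asin(max(-1.0, min(1.0, y))))
    return azimuth, elevation, f"{distance_m:.1f} m"

print(arrow_from_uwb((0.26, 0.10, 0.96), 4.2))  # ≈ 15° to the right, 4.2 m away
```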

Continue reading “Open Source Tracker Keeps An Eye On Furry Friends”

A ginger cat, wearing a blue harness with a brass and wooden box on its back

Handmade GPS Tracker Keeps An Eye On Adventurous Cats

One of the most convenient things about having cats is their independent lifestyle: most are happy to enjoy themselves outside all day, only coming back home when it’s time for dinner and a nap. What your cat gets up to during the day remains a mystery, unless you fit it with a GPS collar. When [Sahas Chitlange] went searching for a GPS tracker for his beloved Pumpkin, he found that none were exactly to his liking: too slow, too big, or simply unreliable. This led him to design and build his own, called Find My Cat.

Continue reading “Handmade GPS Tracker Keeps An Eye On Adventurous Cats”

UV Monitoring Budgie Keeps An Eye On Exposure Levels

UV rays are great at helping us generate vitamin D, but they can also be harmful, causing sunburn and even melanoma. To help kids keep track of the UV index in his local area, [Jude Pullen] created the UV Budgie.

The build is based around an Arduino Nano 33 IoT board, which queries the Met Office’s API to determine the UV level in the area. The relevant data is then displayed on a small e-ink display, with cute little sun characters telling you about the prevailing conditions. It also announces the current risk level with recorded voice samples, advising on whether precautions should be taken, such as using sunscreen or sheltering inside for the worst days. Plus, there’s a bird that flaps its wings to announce an update, actuated by a small servo in the base.
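The general shape of that polling loop is easy to sketch. The endpoint, key, and response field below are placeholders rather than the real Met Office API, but the UV index bands are the standard WHO ones:

```python
import requests

# Placeholder endpoint and key: the real Met Office API and its
# response layout differ, so treat this as a sketch of the idea.
API_URL = "https://example.metoffice.gov.uk/forecast"
API_KEY = "YOUR_KEY"

def uv_risk(index):
    """Standard WHO UV index bands."""
    if index <= 2:
        return "low"
    if index <= 5:
        return "moderate"
    if index <= 7:
        return "high"
    if index <= 10:
        return "very high"
    return "extreme"

resp = requests.get(API_URL, params={"key": API_KEY}, timeout=10)
uv = resp.json()["uv_index"]          # assumed field name
print(f"UV index {uv}: {uv_risk(uv)} risk")
```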

It’s a fun build that should help [Jude] and his family remain sun safe in the summer. [Jude] notes the build could also be reprogrammed to share other warnings, too. APIs to query local air quality or radiation levels are just some of the ideas that come to mind. Video after the break.

Continue reading “UV Monitoring Budgie Keeps An Eye On Exposure Levels”

A tiny CRT showing an eye, inside a plexiglass enclosure

This Eye Is Watching You From Its Tiny CRT

The days of cathode ray tubes, or CRTs, are firmly behind us, and that’s generally a good thing. Display tubes were heavy, bulky and fragile, and needed complicated high-voltage electronics in order to work. But not all of them were actually large: miniature display tubes were also produced, for things like camcorder viewfinders, and [Tavis] from Sideburn Studios decided to turn one of those into a slightly creepy art project.

The heart of this build is a one-inch CRT that was salvaged from an RCA video camera. [Tavis] mounted the tiny tube inside an acrylic box on a 3D printed base. Inside that base sits a Raspberry Pi along with a high-voltage driver and a power management board. The Pi continuously plays a video that shows a human eye blinking and looking in various directions. Just an eye, floating in space, looking at the world around it.

The magic is briefly lost when the Pi starts up, because it then shows a microscopic version of the Pi’s standard bootup sequence, but once the thing is running it adds a weird vibe to a room. It actually looks like something you’d find in an avant-garde art exhibition — in the video (embedded below) it’s accompanied by eerie music that gives it an even more unsettling feel. Electronic eyes are always a bit scary, especially when they’re actually looking at you.

Continue reading “This Eye Is Watching You From Its Tiny CRT”

TapType: AI-Assisted Hand Motion Tracking Using Only Accelerometers

The team from the Sensing, Interaction & Perception Lab at ETH Zürich, Switzerland has come up with TapType, an interesting text input method that relies purely on a pair of wrist-worn devices that sense acceleration values when the wearer types on any old surface. By feeding the acceleration values from the pair of sensors on each wrist into a Bayesian-inference classification neural network, which in turn feeds a traditional probabilistic language model (predictive text, to you and me), text can be input at up to 19 WPM with 0.6% average error. Expert TapTypers report speeds of up to 25 WPM, which could be quite usable.
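The interesting bit is how the two stages combine: the classifier only narrows a tap down to a likely finger, and the language model disambiguates which key under that finger was meant. A toy version of that Bayesian fusion, with entirely made-up keyboard zones and probabilities, might look like this:

```python
# Toy fusion of the two TapType stages: the accelerometer classifier
# gives a likelihood over which finger tapped, and a character language
# model supplies the prior. All numbers below are invented.

FINGER_TO_KEYS = {0: "qaz", 1: "wsx", 2: "edc"}  # toy keyboard zones

def fuse(finger_likelihood, lm_prior):
    """P(char | tap) ∝ P(tap | finger(char)) * P(char | context)."""
    scores = {}
    for finger, keys in FINGER_TO_KEYS.items():
        for ch in keys:
            scores[ch] = finger_likelihood[finger] * lm_prior.get(ch, 1e-6)
    total = sum(scores.values())
    return {ch: s / total for ch, s in scores.items()}

posterior = fuse(
    finger_likelihood={0: 0.7, 1: 0.2, 2: 0.1},           # from the tap classifier
    lm_prior={"a": 0.5, "s": 0.2, "q": 0.01, "w": 0.05},  # from predictive text
)
print(max(posterior, key=posterior.get))  # most probable character: 'a'
```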

Details are a little scarce (it is a research project, after all) but the actual hardware seems simple enough, based around the Dialog DA14695, a nice Cortex-M33 based Bluetooth Low Energy SoC. It’s an interesting device in its own right, containing a “sensor node controller” block that is capable of handling sensor devices connected to its interfaces, independent of the main CPU. The sensor device used is the Bosch BMA456 3-axis accelerometer, which is notable for its low power consumption of a mere 150 μA.

Users can “type” on any convenient surface.

The wristband units themselves appear to be a combination of a main PCB, hosting the BLE chip and supporting circuitry, connected to a flex PCB with an accelerometer at each end. The assembly was then slipped into a flexible wristband, likely constructed from 3D printed TPU, but we’re just guessing really, as the progression from the first embedded platform to the wearable prototype is unclear.

What is clear is that the wristband itself is just a dumb data-streaming device, and all the clever processing is performed on the connected device. Training of the system (and subsequent selection of the most accurate classifier architecture) was performed by recording volunteers “typing” on an A3 sized keyboard image, with finger movements tracked with a motion tracking camera, whilst recording the acceleration data streams from both wrists. There are a few more details in the published paper for those interested in digging into this research a little deeper.
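As a sketch of how that labeled dataset might be assembled (the sample rate, window size, and array shapes are our assumptions, not the paper’s exact figures), each camera-detected tap becomes one training window of wrist acceleration:

```python
import numpy as np

FS = 200      # assumed accelerometer sample rate, Hz
WINDOW = 40   # 200 ms of samples centered on each tap

def build_dataset(accel, tap_times, tap_fingers):
    """Cut a window of wrist acceleration around each mocap-detected tap.

    accel:       (n_samples, 6) array, three axes per wrist
    tap_times:   tap instants in seconds, from the motion-tracking camera
    tap_fingers: which finger the camera saw tapping, one label per tap
    """
    X, y = [], []
    for t, finger in zip(tap_times, tap_fingers):
        i = int(t * FS)
        if i - WINDOW // 2 < 0 or i + WINDOW // 2 > len(accel):
            continue  # tap too close to the edge of the recording
        X.append(accel[i - WINDOW // 2 : i + WINDOW // 2])
        y.append(finger)
    return np.stack(X), np.array(y)
```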

The eagle-eyed may remember something similar from last year, from the same team, which correlated bone-conduction sensing with VR-style hand tracking to generate input events inside a VR environment.

Continue reading “TapType: AI-Assisted Hand Motion Tracking Using Only Accelerometers”