There’s a lot of tech that goes into animatronics, cosplay, and costumes. For their Hackaday Prize entry, [Dasaki] and [Dylan] are taking the eyes in a costume or Halloween prop to the next level with animatronic eyes that look where the wearer of this crazy confabulation is looking. It’s XEyes in real life, and it promises to be a part of some very, very cool costumes.
The mechanics of this system are actually pretty simple — it’s just a few servos joined together to make a pair of robotic eyes move up and down and left to right. The whole mechanism is mounted on a frame, to which a very small camera is attached, pointed directly at the user’s (real) eye. The software is where things get fun: it’s a basic eye-tracking setup, with IR light illuminating the pupil and a compute unit working out where the user is looking, then steering the servos to match.
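To make the idea concrete, here’s a rough sketch in Python with OpenCV — not the project’s actual code — that finds the dark pupil under IR illumination by thresholding, then linearly maps its pixel position to pan and tilt servo angles. The camera index, threshold value, and servo limits are all made-up placeholders.

```python
# Hypothetical sketch: locate the IR-lit pupil with OpenCV and map its
# position to pan/tilt servo angles. All constants here are assumptions,
# not values from the actual build.
import cv2

PAN_RANGE = (60, 120)    # assumed pan servo limits, degrees
TILT_RANGE = (70, 110)   # assumed tilt servo limits, degrees

def find_pupil(gray):
    """Return (x, y) of the largest dark blob (the pupil), or None."""
    # Under IR illumination the pupil shows up as a dark, roughly round spot.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (x, y), _radius = cv2.minEnclosingCircle(largest)
    return x, y

def to_servo(value, size, out_range):
    """Linearly map a pixel coordinate onto a servo angle range."""
    lo, hi = out_range
    return lo + (value / size) * (hi - lo)

cap = cv2.VideoCapture(0)  # eye camera (index is an assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pupil = find_pupil(gray)
    if pupil:
        h, w = gray.shape
        pan = to_servo(pupil[0], w, PAN_RANGE)
        tilt = to_servo(pupil[1], h, TILT_RANGE)
        # send_angles(pan, tilt)  # placeholder for whatever drives the servos
        print(f"pan={pan:.1f}  tilt={tilt:.1f}")
```

In practice you’d want to calibrate the mapping (look at known targets and fit the pixel-to-angle conversion) and smooth the output so the animatronic eyes don’t jitter, but the basic pipeline — find the pupil, convert to angles, command the servos — is this simple.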
For the software, [Dasaki] and [Dylan] have collected a bunch of links, but right now the best solutions are the OpenMV camera and the Eye of Horus project from last year’s Hackaday Prize. It’s a great project, and a really fun entry for the Automation portion of this year’s Hackaday Prize.