Glasses are perhaps the least invasive method of vision correction, followed by contact lenses. Each has its drawbacks, though, and some seek a more permanent solution in laser eye surgeries like LASIK, which aim to reshape the cornea for better visual clarity. However, these methods involve cutting into the eye itself, and it hardly gets more invasive than that.
[Enza3D] shows off a surprisingly compact articulated animatronic eyeball that can be intuitively controlled with a Wii nunchuk controller. The design uses 3D printed parts and some tiny servos, and all of the necessary electronics can be easily purchased online. The mechanical design of the eye is very impressive, and [Enza3D] walks through several different versions of the design, the end result of which is a tidy little assembly that would fit nicely into masks, costumes, or other projects.
A Wii nunchuk is ideal for manual control of such a device, thanks to its ergonomic design and simple interface: the nunchuk communicates over I2C, which is easily within reach of even the most modest of microcontrollers. Of course, since driving servos is almost trivial nowadays, it doesn't look like working this into an automated project would pose much of a challenge.
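For the curious, the handshake really is only a few bytes. Below is a minimal Arduino sketch of the idea (not [Enza3D]'s code; the pin choices and stick range are our assumptions) that initializes the nunchuk, reads its six-byte report over I2C, and maps the stick to a pan/tilt servo pair:

```cpp
// Minimal Arduino sketch: read a Wii nunchuk over I2C and aim two servos.
// Pin choices and the stick range are assumptions.
#include <Wire.h>
#include <Servo.h>

const uint8_t NUNCHUK_ADDR = 0x52;  // the nunchuk's fixed I2C address
Servo panServo, tiltServo;

void setup() {
  Wire.begin();
  panServo.attach(9);
  tiltServo.attach(10);
  // Unencrypted init sequence, works with genuine and clone nunchuks
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0xF0);
  Wire.write(0x55);
  Wire.endTransmission();
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0xFB);
  Wire.write(0x00);
  Wire.endTransmission();
}

void loop() {
  // Request a fresh six-byte report
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0x00);
  Wire.endTransmission();
  delay(3);
  Wire.requestFrom(NUNCHUK_ADDR, (uint8_t)6);
  if (Wire.available() >= 6) {
    uint8_t joyX = Wire.read();  // byte 0: stick X, roughly 30-220
    uint8_t joyY = Wire.read();  // byte 1: stick Y
    for (int i = 0; i < 4; i++) Wire.read();  // skip accel/button bytes
    panServo.write(map(joyX, 30, 220, 0, 180));
    tiltServo.write(map(joyY, 30, 220, 0, 180));
  }
  delay(20);
}
```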
The devices themselves consist of electrodes implanted into the retina, which send signals to the nervous system that appear as spots of light to the user. A camera feed captures images, which are then translated into signals sent to the retinal electrodes. The results are low-resolution to say the least, and the vision supplied is crude, but it gives blind users a rudimentary sense they never had before. It's very much a visual equivalent of cochlear implant technology.
The story is altogether too familiar: Second Sight Medical Products came out with a cutting-edge device, raised money, and put it out into the world, only to go bankrupt down the road, leaving its users high and dry. Over 350 people have the implants fitted in one eye, while [Terry Byland] is the sole person to have them implanted in both eyes. Performance of the device was mixed, with some users raving about it while others questioned its utility.
Eyeballs are often watching us, but they’re usually embedded in the skull of another human or animal. When they’re staring at you by themselves, they can be altogether more creepy. This Halloween project from [allpartscombined] aims to elicit that exact spooky vibe.
The project relies on a Kinect V2 to do body tracking, feeding data to a Unity app that figures out how to aim the eyeball at any humans detected in the scene. The app sends angle data to an Arduino over serial, and the microcontroller generates the signals needed to command the servos that move the eyeball.
With tilt and pan servos fitted and the precision tracking from the Kinect data, the eye can be aimed at people in two dimensions. It’s significantly spookier than simply panning the eye back and forth.
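On the Arduino side, the firmware needs to be little more than a serial parser feeding two servos. A minimal sketch of that half of the pipeline might look like this (the comma-separated "pan,tilt" line protocol and pin numbers are our assumptions, not [allpartscombined]'s actual format):

```cpp
// Minimal Arduino sketch: parse "pan,tilt" angle pairs from serial and move
// two servos. The line protocol and pins are assumptions, not
// [allpartscombined]'s actual format.
#include <Servo.h>

Servo panServo, tiltServo;

void setup() {
  Serial.begin(115200);
  panServo.attach(9);
  tiltServo.attach(10);
}

void loop() {
  if (Serial.available()) {
    // Expect newline-terminated lines like "97,124" from the Unity app
    int pan = Serial.parseInt();
    int tilt = Serial.parseInt();
    if (Serial.read() == '\n') {
      panServo.write(constrain(pan, 0, 180));
      tiltServo.write(constrain(tilt, 0, 180));
    }
  }
}
```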
The build was actually created by modifying an earlier project, an airsoft turret, something we've seen a few times around these parts. Fundamentally, the tracking part is the same; it's just that in this case, the eye doesn't shoot at people… yet! Video after the break.
Wouldn’t it be nice if every webcam had a hardware switch? Especially for those built-in webcams like the one in your laptop. Since they don’t have switches yet, we’re just stuck trying to remember to turn them off or re-apply the sticker after every meeting. [Becky Stern] was tired of trying to remember to blind the all-seeing eye, and decided to make a robot companion that would do it for her.
Essentially, a servo-driven, 3D-printed eyelid covers the eye's iris and also the webcam directly underneath. At first, we thought [Becky] had liberated the business parts of a cheap webcam and built them into the eyeball, but this is far less intrusive. The eyeball simply sits atop the monitor, and [Becky] can control the eyelid two ways: she can set a timer with the potentiometer to close it automatically after some number of minutes, or do it on demand using the momentary button. We'd love to see it tied directly to Zoom or whatever else [Becky] uses regularly. Be sure to check out the build and demo video after the break to see it in action.
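The control logic is about as simple as embedded code gets. Here's a rough sketch of the behavior described above; the pins, angles, and timer range are all guesses rather than [Becky]'s actual firmware:

```cpp
// A rough sketch of the eyelid logic described above. The pins, angles, and
// timer range are guesses, not [Becky]'s actual firmware.
#include <Servo.h>

Servo lidServo;
const int POT_PIN = A0;     // timer-setting potentiometer
const int BUTTON_PIN = 2;   // momentary pushbutton to ground
const int LID_OPEN = 10;    // servo angles, tune to the printed lid
const int LID_CLOSED = 100;

unsigned long deadline = 0;
bool closed = false;

void setTimerFromPot() {
  // Map the pot to a 1-60 minute countdown (range is an assumption)
  long minutes = map(analogRead(POT_PIN), 0, 1023, 1, 60);
  deadline = millis() + minutes * 60000UL;
}

void setup() {
  lidServo.attach(9);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  lidServo.write(LID_OPEN);
  setTimerFromPot();
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {  // button toggles the lid on demand
    closed = !closed;
    delay(250);                          // crude debounce
  }
  if (millis() >= deadline) {            // pot-set countdown expired
    closed = true;
  }
  lidServo.write(closed ? LID_CLOSED : LID_OPEN);
}
```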
Like many of us, [Emily's Electric Oddities] has had a lot of time for projects over the past year or so, including one that had been kicking around since late 2018. It all started at the Hackaday Superconference, when [Emily] encountered the Adafruit HalloWing board in the swag bag. Since that time, [Emily] has wanted to display the example code's eyeball animation on a CRT, but didn't really know how to go about it. Spoiler alert: it works now.
See? It’s educational.
Eventually, [Emily] learned about the TVout library for Arduino and got everything working properly: the eyeball would move around with the joystick, blink when the button was pressed, and the pupil would respond visually to changes in ambient light. The only problem was that the animation moved at a lousy four frames per second. Well, until she got Hackaday's own [Roger Cheng] involved.
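To give an idea of how little code it takes to put an eye on a CRT, here's a bare-bones sketch in the spirit of the project using the TVout library. It isn't [Emily]'s or [Roger]'s code, and the joystick pins and geometry are assumptions:

```cpp
// Bare-bones TVout sketch: a pupil that follows an analog joystick, drawn
// as 1-bit composite video. Joystick pins and geometry are assumptions.
#include <TVout.h>

TVout TV;

void setup() {
  TV.begin(NTSC, 120, 96);  // 120x96 monochrome framebuffer, composite out
}

void loop() {
  TV.clear_screen();
  // Joystick on A0/A1 steers the pupil around inside the iris
  int px = map(analogRead(A0), 0, 1023, -15, 15);
  int py = map(analogRead(A1), 0, 1023, -10, 10);
  TV.draw_circle(60, 48, 30, WHITE);                  // iris outline
  TV.draw_circle(60 + px, 48 + py, 8, WHITE, WHITE);  // filled pupil
  TV.delay_frame(2);  // pace the redraw to the video frame rate
}
```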
[Roger] was able to streamline the code to align with [Emily]'s dreams, and then it was on to our favorite part of this build: the cabinet design. Since the TVout library is limited to black and white output without shades of gray, [Emily] took design cues from the late '70s/early '80s, particularly the yellow and wood of the classic PONG cabinet. We love it!
Is Your Pet Eye the worst video game ever, as [Emily] proclaims it to be? Not a chance, and we’re pretty sure that the title still rests with Desert Bus, anyway. Even though the game only lasts until the eye gets tired and goes to sleep, it’s way more fun than Your Pet Rock. Don’t miss the infomercial/explanation/demonstration video after the break. If one video is just not enough, learn more about [Emily’s] philosophy of building weird projects from the Supercon talk she presented. It’s also worth mentioning that this one fits right into the Reinvented Retro contest.
Most of us, if we have bought a single board computer with the capability to support a camera, will have succumbed to temptation and shelled out for that peripheral in the hope that we can coax our new toy into having sight. We’ll have played with the command line tool and taken a few random images of our bench, but then what? There is so much possibility in a camera that our colleague [Steven Dufresne] wanted to explore with his Raspberry Pi, so he built a motorised eyeball mount with which to do so.
Pan & tilt mounts using RC servos are nothing especially new, but in this one he’s put some design effort that maybe some of the others lack. A lot of effort has gone in to ensuring no interference between the two axes, and in a slightly macabre twist until you remember it’s a model he’s talking about, the unit has been designed to fit inside a human head.
The servos are driven from the Pi using a servo driver board he's discussed in another video, so once he's described the assembly, with a few design tweaks thrown in, he has a quick look at the software demo he's written emulating neurons for eye tracking. He promises that it will be put up somewhere for download in due course.
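Since that code isn't published yet, here's a generic sketch for anyone who wants to exercise a pan & tilt mount from a Pi in the meantime. It uses the pigpio C library to drive the servos straight from GPIO, rather than the driver board [Steven] uses:

```cpp
// Generic pan & tilt test for a Raspberry Pi using the pigpio C library,
// driving the servos straight from GPIO rather than a driver board.
// Build with: g++ pantilt.cpp -lpigpio   (run as root for GPIO access)
#include <pigpio.h>
#include <cstdio>

const unsigned PAN_GPIO = 17;   // hypothetical pin assignments
const unsigned TILT_GPIO = 18;

// Convert 0-180 degrees to a 500-2500 microsecond servo pulse
unsigned angleToPulse(double degrees) {
  return 500 + (unsigned)(degrees * 2000.0 / 180.0);
}

int main() {
  if (gpioInitialise() < 0) {
    std::fprintf(stderr, "pigpio init failed\n");
    return 1;
  }
  // Sweep the eye through a slow horizontal scan at level tilt
  for (int pan = 45; pan <= 135; pan += 5) {
    gpioServo(PAN_GPIO, angleToPulse(pan));
    gpioServo(TILT_GPIO, angleToPulse(90));
    time_sleep(0.2);  // pigpio helper: sleep for a fraction of a second
  }
  gpioServo(PAN_GPIO, 0);   // a width of zero switches the pulses off
  gpioServo(TILT_GPIO, 0);
  gpioTerminate();
  return 0;
}
```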
If you’re in the market for a pan & tilt mount for your Pi, this one could make a lot of sense to throw at your 3D printer. It’s certainly more accomplished than this previous one we’ve shown you.