[Enza3D] shows off a surprisingly compact articulated animatronic eyeball that can be intuitively controlled with a Wii nunchuk controller. The design uses 3D printed parts and some tiny servos, and all of the necessary electronics can be easily purchased online. The mechanical design of the eye is very impressive, and [Enza3D] walks through several different versions of the design, the end result of which is a tidy little assembly that would fit nicely into masks, costumes, or other projects.
A Wii nunchuk is ideal for manual control of such a device, thanks to its ergonomic design and ease of interfacing (the nunchuk communicates over I2C, which is easily within the reach of even the most modest of microcontrollers). Of course, since driving servos is likewise almost trivial nowadays, it doesn't look like working this into an automated project would pose much of a challenge.
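For the curious, the nunchuk's I2C protocol is well documented: after an unencrypted init handshake, each read returns a six-byte report packing the joystick, a 10-bit accelerometer, and two active-low buttons. A minimal decoder might look like the sketch below (Python for clarity; the byte layout matches the commonly documented format, but verify against your own hardware):

```python
def decode_nunchuk(report: bytes) -> dict:
    """Decode a 6-byte Wii nunchuk report (unencrypted init assumed).

    Bytes 0-1: joystick X/Y. Bytes 2-4: accelerometer high 8 bits.
    Byte 5: accelerometer low bits plus the C and Z buttons (active-low).
    """
    if len(report) != 6:
        raise ValueError("expected a 6-byte report")
    b5 = report[5]
    ax = (report[2] << 2) | ((b5 >> 2) & 0x03)
    ay = (report[3] << 2) | ((b5 >> 4) & 0x03)
    az = (report[4] << 2) | ((b5 >> 6) & 0x03)
    return {
        "joy": (report[0], report[1]),
        "accel": (ax, ay, az),
        "z": not (b5 & 0x01),         # pressed reads as 0
        "c": not ((b5 >> 1) & 0x01),
    }
```

Mapping the decoded joystick values onto servo angles is then a one-liner.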
The devices themselves consist of electrodes implanted into the retina, which send signals to the nervous system that appear as spots of light to the user. A camera feed captures images, which are then translated into signals sent to the retinal electrodes. The results are low-resolution to say the least, and the vision supplied is crude, but it gives users who are blind a rudimentary sense they never had before. It's very much a visual equivalent of cochlear implant technology.
The story is altogether too familiar; Second Sight Medical Products came out with a cutting-edge device, raised money and put it out into the world, only to go bankrupt down the road, leaving its users high and dry. Over 350 people have the implant fitted in one eye, while one Terry Byland is the sole person to have implants in both eyes. Performance of the device was mixed, with some users raving about it while others questioned its utility.
Eyeballs are often watching us, but they’re usually embedded in the skull of another human or animal. When they’re staring at you by themselves, they can be altogether more creepy. This Halloween project from [allpartscombined] aims to elicit that exact spooky vibe.
The project relies on a Kinect V2 to do body tracking. It feeds data to a Unity app that figures out how to aim the eyeball at any humans detected in the scene. The app sends angle data to an Arduino over serial, with the microcontroller generating the necessary signals to command servos which move the eyeball.
With tilt and pan servos fitted and the precision tracking from the Kinect data, the eye can be aimed at people in two dimensions. It’s significantly spookier than simply panning the eye back and forth.
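The aiming math in the Unity app likely boils down to converting the Kinect's 3D skeleton position into pan and tilt angles. A plausible sketch (coordinate convention assumed: x right, y up, z forward from the eye; the actual project's code may differ):

```python
import math

def aim_angles(target, eye=(0.0, 0.0, 0.0)):
    """Pan and tilt angles (degrees) to point an eye at a 3D target.

    Returns (pan, tilt), where (0, 0) means straight ahead.
    """
    dx = target[0] - eye[0]
    dy = target[1] - eye[1]
    dz = target[2] - eye[2]
    pan = math.degrees(math.atan2(dx, dz))                    # left/right
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # up/down
    return pan, tilt
```

These two angles are exactly what gets sent over serial to the Arduino for the pan and tilt servos.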
The build was actually created by modifying an earlier project to create an airsoft turret, something we’ve seen a few times around these parts. Fundamentally, the tracking part is the same, just in this case, the eye doesn’t shoot at people… yet! Video after the break.
Wouldn’t it be nice if every webcam had a hardware switch? Especially for those built-in webcams like the one in your laptop. Since they don’t have switches yet, we’re just stuck trying to remember to turn them off or re-apply the sticker after every meeting. [Becky Stern] was tired of trying to remember to blind the all-seeing eye, and decided to make a robot companion that would do it for her.
Essentially, a servo-driven, 3D-printed eyelid covers the eye’s iris and also the webcam directly underneath. At first, we thought [Becky] had liberated the business parts of a cheap webcam and built it into the eyeball, but this is far less intrusive. The eyeball simply sits atop the monitor, and [Becky] can control the eyelid two ways: she can set a timer with the potentiometer to close it automatically after some number of minutes, or else do it on demand using the momentary button. We’d love to see it tied directly to Zoom or whatever else [Becky] uses regularly. Be sure to check out the build and demo video after the break to see it in action.
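The control logic is pleasingly simple: map the potentiometer onto a delay, then close the lid when either the timer expires or the button fires. A hypothetical version of that logic (the 1–30 minute range is our guess, not [Becky]'s actual scaling):

```python
def timer_minutes(adc_value, min_minutes=1, max_minutes=30):
    """Map a 10-bit potentiometer reading (0-1023) to a delay in minutes."""
    adc_value = max(0, min(1023, adc_value))
    span = max_minutes - min_minutes
    return min_minutes + round(adc_value * span / 1023)

def lid_should_close(elapsed_s, delay_min, button_pressed):
    """Close the lid when the button is pressed or the timer expires."""
    return button_pressed or elapsed_s >= delay_min * 60
```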
Like many of us, [Emily’s Electric Oddities] has had a lot of time for projects over the past year or so, including one that had been kicking around since late 2018. It all started at the Hackaday Superconference, when [Emily] encountered the Adafruit Hallowing board in the swag bag. Since that time, [Emily] has wanted to display the example code eyeball movement on a CRT, but didn’t really know how to go about it. Spoiler alert: it works now.
Eventually, [Emily] learned about the TV out library for Arduino and got everything working properly — the eyeball moves around with the joystick, blinks when the button is pressed, and the pupil responds visually to changes in ambient light. The only problem was that the animation moved at a lousy four frames per second. Well, until she got Hackaday’s own [Roger Cheng] involved.
[Roger] was able to streamline the code to align with [Emily]’s dreams, and then it was on to our favorite part of this build — the cabinet design. Since the TV out library is limited to black and white output without shades of gray, [Emily] took design cues from the late ’70s/early ’80s, particularly the yellow and wood of the classic PONG cabinet. We love it!
Most of us, if we have bought a single board computer with the capability to support a camera, will have succumbed to temptation and shelled out for that peripheral in the hope that we can coax our new toy into having sight. We’ll have played with the command line tool and taken a few random images of our bench, but then what? There is so much possibility in a camera that our colleague [Steven Dufresne] wanted to explore with his Raspberry Pi, so he built a motorised eyeball mount with which to do so.
Pan & tilt mounts using RC servos are nothing especially new, but in this one he’s put in some design effort that many of the others lack. A lot of effort has gone into ensuring no interference between the two axes, and in a slightly macabre twist — until you remember it’s a model he’s talking about — the unit has been designed to fit inside a human head.
The servos are driven from the Pi using a servo driver board he’s discussed in another video, so once he’s described the assembly, with a few design tweaks thrown in, he has a quick look at the software demo he’s written that emulates neurons for eye tracking. He promises that will be put up somewhere for download in due course.
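[Steven] covers his driver board elsewhere, but the usual arrangement for a Pi is a dedicated 16-channel PWM chip in the PCA9685 mould: 50 Hz output, 12-bit resolution, servo position set by pulse width. Assuming that style of board (and typical 1000–2000 µs servo endpoints, which vary by servo), the angle-to-register conversion looks like this:

```python
def servo_ticks(angle_deg, freq_hz=50, min_us=1000, max_us=2000):
    """Convert a servo angle (0-180 deg) to a 12-bit PWM compare value.

    Assumes a PCA9685-style driver with 4096 ticks per PWM period.
    """
    angle_deg = max(0.0, min(180.0, angle_deg))
    pulse_us = min_us + (max_us - min_us) * angle_deg / 180.0
    period_us = 1_000_000 / freq_hz   # 20,000 us per period at 50 Hz
    return round(pulse_us / period_us * 4096)
```

Writing that value to the channel's off-time register is all it takes to move the axis.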
If you’re in the market for a pan & tilt mount for your Pi, this one could make a lot of sense to throw at your 3D printer. It’s certainly more accomplished than this previous one we’ve shown you.
This is an older project, but the electromechanical solution used to create this giant, staring eyeball is worth a peek. [Richard] and [Anton] needed a big, unblinking eyeball that could look in any direction and their solution even provides an adjustable pupil and iris size. Making the pupil dilate or contract on demand is a really nice feature, as well.
The huge fabric sphere is lit from the inside with a light bulb at the center, and the iris and pupil mechanism orbit the bulb like parts of an orrery. By keeping the bulb in the center and orbiting the blue gel (for the iris) and the opaque disk (for the pupil) around the bulb, the eye can appear to gaze in different directions. By adjusting the distance of the disks from the bulb, the size of the iris and pupil can be changed.
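The geometry here is the same as a shadow puppet's: with a point-ish light at the center, the disk casts a cone of shadow whose half-angle is set by the disk's radius and its distance from the bulb. A quick model of the effect (idealized — the real bulb isn't a point source):

```python
import math

def shadow_radius(disk_r, disk_d, sphere_r):
    """Radius of the shadow a centered disk casts on the sphere's surface.

    Models a point light at the sphere's center with a circular disk of
    radius disk_r at distance disk_d from the bulb (disk_d < sphere_r).
    Moving the disk closer to the bulb widens the shadow cone, making
    the pupil or iris appear larger.
    """
    if not 0 < disk_d < sphere_r:
        raise ValueError("disk must sit between the bulb and the fabric")
    half_angle = math.atan2(disk_r, disk_d)
    return sphere_r * math.sin(half_angle)
```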
A camera system picks out objects (like people) and directs the eye to gaze at them. The system is clever, but the implementation is not perfect. As you can see in the short video embedded below, detection of a person walking by lags badly. Also, there are oscillations present in the motion of the iris and pupil. Still, as a mechanism it’s a beauty.
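Oscillations like those are a common failure mode when raw detections drive the motors directly. One standard remedy — not what [Richard] and [Anton] used, as far as we know — is to low-pass the target position before commanding the mechanism, for instance with an exponential moving average:

```python
def smooth(samples, alpha=0.2):
    """Exponential moving average over a stream of target positions.

    alpha near 0 means heavy smoothing (less jitter, more lag);
    alpha near 1 passes the raw samples through.
    """
    out = []
    y = None
    for x in samples:
        y = x if y is None else alpha * x + (1 - alpha) * y
        out.append(y)
    return out
```

The trade-off is more lag, which this system already struggles with, so the filter constant would need careful tuning.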