[Will] wanted to build some animatronic eyes that didn’t require high-precision 3D printing. He wound up with a forgiving design that uses an Arduino and six servo motors. You can see the eyes moving around in the video below.
The bill of materials is pretty simple and features an Arduino, a driver board, and a joystick. The 3D-printed parts are easy to print with no supports and will work with PLA. Other than opening up holes, there wasn’t much post-processing required, though he did sand the actual eyeballs, which sounds painful.
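The write-up doesn’t dwell on the firmware, but the general shape of a joystick-to-servo sketch is simple. Here’s a minimal sketch assuming the driver board is a PCA9685-style PWM servo driver (a common choice, but our assumption, not necessarily [Will]’s exact hardware) and the Adafruit PWM servo driver library, mapping the stick to a pan and a tilt servo:

```cpp
// Minimal sketch: map a joystick to eye pan/tilt servos via a PCA9685 driver.
// The driver board, pins, and pulse limits are assumptions, not [Will]'s exact setup.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // default I2C address 0x40

const int JOY_X = A0;        // joystick X axis
const int JOY_Y = A1;        // joystick Y axis
const int SERVO_MIN = 150;   // PWM counts for roughly 0 degrees (tune per servo)
const int SERVO_MAX = 600;   // PWM counts for roughly 180 degrees

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);        // standard 50 Hz servo update rate
}

void loop() {
  // Map the 0-1023 analog readings onto the servo pulse range.
  int pan  = map(analogRead(JOY_X), 0, 1023, SERVO_MIN, SERVO_MAX);
  int tilt = map(analogRead(JOY_Y), 0, 1023, SERVO_MIN, SERVO_MAX);
  pwm.setPWM(0, 0, pan);     // channel 0: left/right
  pwm.setPWM(1, 0, tilt);    // channel 1: up/down
  delay(20);
}
```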
There are so many important design decisions behind a robot: battery, means of locomotion, and position sensing, to name a few. But at a library in Helsinki, one of the most surprising design features for a librarian’s assistant robot was googly eyes. A company called Futurice built a robot for the Oodi library and found that googly eyes were a very important component.
The eyes aren’t there to help the robot see; of course they aren’t functional, at least not in that way. However, without the eyes, the robot’s designers found that people had trouble relating to the service robot. In addition, the robot needed emotions that it could show using the eyes and various sounds, along with motion. This was inspired, apparently, by Disney’s rules of animation; in particular, the eyes fit the rule of “exaggeration.” The robot could look bored when it had no task, excited when it was helping people, and unhappy when people were not being cooperative.
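The write-up doesn’t get into Futurice’s code, but conceptually the emotion layer boils down to a small state machine mapping the robot’s situation to an eye animation. A purely hypothetical sketch of that mapping:

```cpp
// Hypothetical emotion state machine, in the spirit of the Oodi robot's
// bored/excited/unhappy displays. Names and logic are illustrative only.
enum class Mood { Bored, Excited, Unhappy };

Mood moodFor(bool hasTask, bool userCooperating) {
  if (!hasTask) return Mood::Bored;             // nothing to do: look bored
  if (!userCooperating) return Mood::Unhappy;   // blocked path or ignored requests
  return Mood::Excited;                         // actively helping someone
}

void playEyeAnimation(Mood mood) {
  switch (mood) {
    case Mood::Bored:   /* slow blinks, eyes drifting around   */ break;
    case Mood::Excited: /* wide eyes, quick darting glances    */ break;
    case Mood::Unhappy: /* narrowed eyes, downcast gaze, sighs */ break;
  }
}
```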
The build relies on special contact lenses, which [Kyle] suggests are best sourced by searching for “electric blue contact lenses”. These glow in the presence of UV light, which here is provided by a strip of UV LEDs embedded into Thor’s helmet from the recent Marvel movies.
The concept is simple, but the attention to detail is what makes this project a winner. Not content with an earlier build that was a tangle of wires and uncomfortable to use, [KyleofAsgard] made some smart upgrades. The battery for the LEDs and all circuitry is built into the helmet, making it easy to take on and off on those long convention days. For a more impressive effect, a relay is used to turn the LEDs on by remote control with a 433MHz module. This allows [Kyle] or an assistant to trigger the effect covertly, adding plenty of drama when the eyes suddenly begin to shine. It’s all done with off-the-shelf parts that even a novice could put together.
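[Kyle] went with an off-the-shelf remote relay, but if you wanted to roll your own trigger with a bare 433 MHz receiver and a microcontroller, the widely used RCSwitch Arduino library gets you most of the way there. The pin choices and remote code below are placeholders, not values from this build:

```cpp
// Hypothetical DIY version of the remote trigger: a 433 MHz receiver read with
// the RCSwitch library flips a relay driving the UV LED strip.
#include <RCSwitch.h>

RCSwitch rfReceiver = RCSwitch();

const int RELAY_PIN = 7;                 // relay module input (assumed wiring)
const unsigned long TOGGLE_CODE = 5393;  // placeholder code from a cheap 433 MHz fob

bool eyesOn = false;

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  rfReceiver.enableReceive(0);           // interrupt 0 = digital pin 2 on an Uno
}

void loop() {
  if (rfReceiver.available()) {
    if (rfReceiver.getReceivedValue() == TOGGLE_CODE) {
      eyesOn = !eyesOn;                  // each button press toggles the glow
      digitalWrite(RELAY_PIN, eyesOn ? HIGH : LOW);
    }
    rfReceiver.resetAvailable();
  }
}
```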
Unless you have an incredibly well-stocked parts bin, it’s probably too late to build these spooky animated eyes to scare off the neighborhood kiddies this year. But next year…
It’s pretty clear that Halloween decorating has gone over the top recently. It may not be as extreme as some Christmas displays, but plenty of folks like to up the scare factor, and [wermy] seems to number himself among those with the spirit of the season. Like Christmas lights, these eyes are deployed as a string, but rather than just blink lights, they blink creepy eyes from various kinds of creatures. Each eye is drawn on its own backlit TFT LCD housed in a 3D-printed enclosure. Two pairs of eyes can be driven by the SPI interface of one ItsyBitsy M0 Express; driving more displays works, but the frame rate drops to an unacceptable level if you stretch it too far. Strung together on scraps of black Ethernet cable, the peepers can live in the shrubs next to the front door or along the walk, and with surprisingly modest power needs, you’ll get a full night of frights from a USB battery bank.
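Driving several displays from one SPI bus mostly comes down to giving each panel its own chip-select line. Here’s a minimal sketch of the idea, assuming ST7735-style panels and the Adafruit GFX/ST7735 libraries rather than [wermy]’s actual displays or code:

```cpp
// Two TFTs sharing one SPI bus, each with its own chip-select, blinking a
// crude eye. Display type and pin numbers are assumptions for illustration.
#include <Adafruit_GFX.h>
#include <Adafruit_ST7735.h>

Adafruit_ST7735 leftEye  = Adafruit_ST7735(10, 9, 8);  // CS, DC, RST
Adafruit_ST7735 rightEye = Adafruit_ST7735(7, 9, 6);   // shared DC, separate CS/RST

void drawEye(Adafruit_ST7735 &tft, bool open) {
  tft.fillScreen(ST77XX_BLACK);
  if (open) {
    tft.fillCircle(64, 64, 40, ST77XX_WHITE);   // sclera
    tft.fillCircle(64, 64, 15, ST77XX_GREEN);   // iris
  } else {
    tft.fillRect(24, 60, 80, 8, ST77XX_WHITE);  // closed lid
  }
}

void setup() {
  leftEye.initR(INITR_144GREENTAB);   // assumed 1.44" 128x128 panels
  rightEye.initR(INITR_144GREENTAB);
}

void loop() {
  drawEye(leftEye, true);
  drawEye(rightEye, true);
  delay(2500);
  drawEye(leftEye, false);            // blink
  drawEye(rightEye, false);
  delay(150);
}
```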
We like the look of these, and maybe we’ll do something about it next year. If you’re still in the mood to scare and don’t have the time for animated eyes this year, try these simple Arduino blinky eyes for a quick hit.
This may come as a shock, but some of those hot screaming deals on China-sourced gadgets and goodies are not all they appear. After you plunk down your pittance and wait a few weeks for the package to arrive, you just might find that you didn’t get exactly what you thought you ordered. Or worse, you may get a product with unwanted “features,” like some green lasers that also emit strongly at infrared wavelengths.
Sure, getting a free death ray in addition to your green laser sounds like a bargain, but as [Brainiac75] points out, it actually represents a dangerous situation. He knows whereof he speaks, having done a thorough exploration of a wide range of cheap (and not so cheap) lasers in the video below. He explains that the paradox of an ostensibly monochromatic source emitting two distinct wavelengths comes from the IR laser at the heart of the diode-pumped solid state (DPSS) laser inside the pointer. The conversion process is only about 48% efficient, meaning that IR leaks out along with the green light. Better-quality DPSS laser pointers include an IR filter to remove it; cheaper ones often omit this essential safety feature. Knowing exactly what wavelengths you’re working with is critical to protecting your eyes; indeed, the first viewer comment on the video is from someone who seared his retina with a cheap green laser while wearing goggles meant only to block the higher-frequency green light.
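For reference, the usual DPSS green-pointer chain looks like this; these are typical wavelengths for this class of device, not measurements from the video:

```latex
% Typical DPSS green laser chain: an IR pump diode drives a Nd:YVO4 crystal,
% and a KTP crystal frequency-doubles the 1064 nm output into visible green.
\lambda_{\text{pump}} \approx 808\ \text{nm}
  \;\longrightarrow\;
\lambda_{\text{IR}} \approx 1064\ \text{nm}
  \;\longrightarrow\;
\lambda_{\text{green}} = \tfrac{1}{2}\,\lambda_{\text{IR}} \approx 532\ \text{nm}
```

Anything that sneaks past the doubling crystal without an IR filter at the aperture comes out around 1064 nm (plus some residual pump light), invisible to the eye, which is exactly why goggles rated only for the visible green line don’t protect you.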
[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.
The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.
Part one of [Anjul] and [Qi]’s method is to manipulate perception of a virtual area relative to the actual physical area by making a person’s pivots not a 1:1 match. In VR, it may appear one has turned more or less than one has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eye for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement, stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area available.
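As a rough illustration, not [Anjul] and [Qi]’s actual implementation, the gating logic can be thought of as a per-frame check: classify the current eye-tracker sample as saccadic or not, and only when it is, fold a small slice of the desired heading offset into the virtual camera’s yaw. The threshold and step size below are assumptions:

```cpp
// Sketch of saccade-gated rotation gain. Each frame, if the eye tracker reports
// a saccade in progress, rotate the virtual world by a small extra yaw so the
// user's physical turn is amplified or damped without them noticing.
#include <cmath>

struct EyeSample { double gazeVelocityDegPerSec; };

// Eye velocities above this threshold are treated as saccadic (assumed value).
constexpr double kSaccadeVelocityThreshold = 180.0;
// Maximum extra rotation injected per frame, kept small to stay imperceptible.
constexpr double kMaxGainDegPerFrame = 0.15;

bool isSaccade(const EyeSample &sample) {
  return sample.gazeVelocityDegPerSec > kSaccadeVelocityThreshold;
}

// Returns the extra virtual yaw (degrees) to apply this frame.
// `desiredOffsetDeg` is how far the virtual and physical headings should
// ultimately diverge to steer the user away from a wall.
double saccadicYawAdjustment(const EyeSample &sample, double desiredOffsetDeg) {
  if (!isSaccade(sample)) {
    return 0.0;                       // eyes are fixating: change nothing
  }
  // Apply only a sliver of the remaining offset, clamped per frame.
  double step = std::fmin(std::fabs(desiredOffsetDeg), kMaxGainDegPerFrame);
  return std::copysign(step, desiredOffsetDeg);
}
```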
Embedded below is a video demonstration and overview, which also covers other methods of manipulating the perception of space in VR and how this technique avoids their pitfalls.
The effect itself is simple – just two glowing orange LEDs spaced the right distance apart, placed in the highest window in the house. As every young child knows, the attic is almost the spookiest room in the house, second only to the basement.
Various effects, like breathing and blinking, were programmed into the Arduino running the show to give the eyes that frightful character. For maintenance and programming purposes, [tdragger] wanted to have the Arduino remotely mounted, and searched for a solution. Rather than leaning on a wireless setup or something modern and off-the-shelf, some old RJ11 telephone extension cables were pressed into service. These let the eyes sit in the window while the Arduino lives in a more accessible location.
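The code for this sort of thing isn’t exotic. A guess at the breathing-and-blinking effect on a stock Arduino, with pin numbers and timing that are our own assumptions rather than [tdragger]’s, looks something like this:

```cpp
// A guess at the effect: fade two orange LEDs up and down on PWM pins,
// with an occasional blink where they wink out for a moment.
const int LEFT_EYE  = 5;   // PWM-capable pins (assumed)
const int RIGHT_EYE = 6;

void setup() {
  pinMode(LEFT_EYE, OUTPUT);
  pinMode(RIGHT_EYE, OUTPUT);
}

void setEyes(int brightness) {
  analogWrite(LEFT_EYE, brightness);
  analogWrite(RIGHT_EYE, brightness);
}

void loop() {
  // Breathe: ramp the glow up and back down between a dim floor and full bright.
  for (int b = 40; b <= 255; b++) { setEyes(b); delay(8); }
  for (int b = 255; b >= 40; b--) { setEyes(b); delay(8); }

  // Every so often, blink by going dark briefly before the next breath.
  if (random(4) == 0) {
    setEyes(0);
    delay(200);
  }
}
```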