If you’re looking to build the next creepy Halloween decoration or simply thinking about trying out OpenCV for the first time, this next project has you covered. [Glen] made a pair of giant googly eyes that follow you around the room using some servos and some very powerful software.
The project was documented in three parts. In Part 1, [Glen] models and builds the eyes themselves, including installing the servo motors that will eventually move them around. The second part involves an Arduino and power supply that will control the servos, and the third part goes over using OpenCV to track faces.
This part of the project is arguably the most interesting if you’re new to OpenCV; [Glen] uses this software package to detect faces. From there, the computer picks out the most prominent face and sends commands to the Arduino to move the eyes to the appropriate position. The project goes into great detail, from Arduino code to installing Ubuntu to running OpenCV for the first time!
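The "pick the most prominent face" step boils down to a little geometry: take the largest bounding box the detector returns and map its center to an eye position. Here's a minimal sketch of that idea in Python (function names are illustrative, not [Glen]'s actual code; in practice the boxes would come from OpenCV's `CascadeClassifier.detectMultiScale`):

```python
# Each detected face is a bounding box: (x, y, w, h) in pixels.
# Pick the most prominent (largest-area) face, then map its horizontal
# center to a signed eye position in the range [-1.0, 1.0].

def most_prominent(faces):
    """Return the bounding box with the largest area, or None if no faces."""
    if not faces:
        return None
    return max(faces, key=lambda f: f[2] * f[3])

def face_to_eye_position(face, frame_width):
    """Map a face's horizontal center to -1.0 (far left) .. 1.0 (far right)."""
    x, y, w, h = face
    center = x + w / 2.0
    return 2.0 * center / frame_width - 1.0

faces = [(10, 20, 40, 40), (200, 50, 120, 120)]       # (x, y, w, h)
target = most_prominent(faces)                        # the 120x120 face wins
print(face_to_eye_position(target, frame_width=640))  # -0.1875, slightly left of center
```

The host would then scale that value to a step count and send it over serial for the Arduino to execute.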
We’ve featured some of [Glen]’s projects before, like his FPGA-driven LED wall, and it’s good to see he’s still making great things!
I used to have nightmares as a child that somebody would bring X.org’s Xeyes into the real world one day…
…I will now never be able to sleep again…
:thumbsup:
I’m sure you realized that they are using steppers and not servos here. I am assuming that’s because servos are loud, and expensive for one that would have enough strength to swing a big acrylic disk. I still would have used servos and linked the two eyes together, utilizing that mechanical advantage to allow a standard servo.
The only instance I could see of using separate motors would be if the eyes are looking down on someone and need to come together when the person is directly underneath. Small sacrifice. I haven’t looked at it yet, but I bet the majority of the code is just paralleling the stepper movements anyway.
I could have used servos but I had a bunch of extra stepper motors sitting around the house. The acrylic discs are heavier than I thought they would be. You can get some pretty good off-balance washing machine action going if you rotate the pupils continuously at high enough speeds.
The motors can be set to individual positions but the Python script sends the same value for both motors to the Arduino. A bit more math in the Python script and they could function more like xeyes.
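The "bit more math" for xeyes-style behavior is just a touch of trigonometry: each pupil aims at the target from its own position, so the two eyes converge on close targets. A hypothetical sketch (eye spacing and units are made up for illustration):

```python
import math

def eye_angles(target_x, target_z, eye_spacing=0.3):
    """Return (left, right) pan angles in degrees toward a target.

    target_x: target's horizontal offset from the midpoint between the eyes
    target_z: target's distance straight out from the eye plane
    eye_spacing: distance between the two pupils (same units as above)
    """
    half = eye_spacing / 2.0
    # Each eye measures the target relative to its own location.
    left = math.degrees(math.atan2(target_x + half, target_z))
    right = math.degrees(math.atan2(target_x - half, target_z))
    return left, right

# A face 1 m out, dead ahead: the pupils converge slightly inward.
print(eye_angles(0.0, 1.0))  # left ~ +8.5 deg, right ~ -8.5 deg
```

Sending `left` to one stepper and `right` to the other, instead of one shared value, would get the cross-eyed effect when someone leans in close.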
I figured as much. Always cheaper to use what you already have. And yeah, I would expect those acrylic disks are quite heavy and the force required is much higher when trying to swing them like that.
What was most interesting to me is the mechanism. One of my hobbies is animatronics. I am always looking for new ideas like this. It had never occurred to me to do eye movement like that. Obviously it wouldn’t work for everything but I could see it as being quick and easy in some designs.
But they can’t stare at you directly.
Put the irises on a delta bot rig and it would.
Hmm, a tiny ~4″ lead screw / linear actuator on the back of each pupil could solve that. I wonder what the smallest, lightest, and thinnest lead screw available is? Probably not cheap though.
Just use threaded rod and nuts. I have done this down to 1mm (dia) for simple things.
That’d work.
use the mechanism from an old 3.5″ floppy?
Much like Google’s eyes!
Built something similar with a top hat for Halloween a couple years ago. Cut open the top and glued 2 ping pong balls in the opening, then glued disks to straightened paper clips that ran through the balls to a servo linkage. It just ran a servo pattern on an AVR to look left, right, center, jitter. But the servo was loud. Should’ve used a stepper… No CV though, just a green LED for backlighting.
This reminds me of my idea for an auto-tipping fedora, for my neckbeard basement dweller costume.