“Necessity is the mother of invention,” or so the saying goes. We’ve never held to that, finding that laziness is a much more powerful creative lubricant. And this story about someone who automated their job with a script is one of the best examples of sloth-driven invention since the TV remote was introduced. If we take the story at face value — and it’s the Internet, so why wouldn’t we? — this is a little scary, as the anonymous employee was in charge of curating digital evidence submissions for a law firm. The job was to watch for new files in a local folder, manually copy them to a cloud server, and verify each file with a hash to prove it hadn’t been tampered with, supporting the chain of custody. The OP says this was literally the only task to perform, so we can’t really blame them for automating it with a script once COVID shutdowns and working from home provided the necessary cover. But still — when your entire job can be done by a Windows batch file and some PowerShell commands while you play video games, we’re going to go out on a limb and say you’re probably underemployed.
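The actual script wasn’t shared, but the workflow as described — watch a folder, copy new files out, verify with a hash — is only a few lines in any language. Here’s a minimal sketch in Python; the folder names are hypothetical, and a real chain-of-custody tool would also log timestamps and operator identity:

```python
import hashlib
import shutil
from pathlib import Path

WATCH_DIR = Path("evidence_inbox")   # hypothetical local submission folder
UPLOAD_DIR = Path("cloud_mount")     # hypothetical cloud-synced destination

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 digest for the chain-of-custody record."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def process_new_files() -> list:
    """Copy each new file to the upload folder and verify the copy's hash."""
    results = []
    UPLOAD_DIR.mkdir(exist_ok=True)
    for src in sorted(WATCH_DIR.iterdir()):
        dst = UPLOAD_DIR / src.name
        if dst.exists():
            continue  # already copied on a previous pass
        original_hash = sha256_of(src)
        shutil.copy2(src, dst)
        # Re-hash the copy to prove the transfer didn't alter the file
        assert sha256_of(dst) == original_hash, f"hash mismatch on {src.name}"
        results.append((src.name, original_hash))
    return results
```

Run that on a timer (or from a batch file, as the OP apparently did) and the “entire job” really is automated.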
People have been bagging on the US Space Force ever since its inception in 2019, which we think is a little sad. It has to be hard being the newest military service, especially since it branched off of the previously newest military service, and no matter how important its mission may be, there’s still always going to be the double stigmas of being both the new kid on the block and the one with a reputation for digging science fiction. And now they’ve given the naysayers yet more to dunk on, with the unveiling of the official US Space Force service song. Every service branch has a song — yes, even the Army, and no, not that one — and they all sound appropriately martial. So does the Space Force song, but apparently people have a problem with it, which we really don’t get at all — it sounds fine to us.
Continue reading “Hackaday Links: October 2, 2022”
[Enza3D] shows off a surprisingly compact articulated animatronic eyeball that can be intuitively controlled with a Wii nunchuk controller. The design uses 3D printed parts and some tiny servos, and all of the necessary electronics can be easily purchased online. The mechanical design of the eye is very impressive, and [Enza3D] walks through several different versions of the design, the end result of which is a tidy little assembly that would fit nicely into masks, costumes, or other projects.
A Wii nunchuk is ideal for manual control of such a device, thanks to its ergonomic design and ease of interfacing (the nunchuk communicates over I2C, which is easily within the reach of even the most modest of microcontrollers). Of course, since driving servos is also almost trivial nowadays, it doesn’t look like working this into an automated project would pose much of a challenge.
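For the curious, the nunchuk’s I2C report is just six bytes. Here’s a sketch of the decode step in Python, assuming the commonly documented “unencrypted” init sequence (writing 0xF0,0x55 then 0xFB,0x00 to address 0x52) so no XOR descrambling is needed — the actual bus read is left out since it’s hardware-specific:

```python
def decode_nunchuk(packet: bytes) -> dict:
    """Decode the 6-byte report a Wii nunchuk returns over I2C.

    Byte layout (per community protocol docs): joystick X/Y, the high
    8 bits of the three 10-bit accelerometer axes, then a status byte
    carrying the C and Z buttons (active low) and the accelerometer LSBs.
    """
    if len(packet) != 6:
        raise ValueError("nunchuk reports are 6 bytes")
    jx, jy, ax, ay, az, status = packet
    return {
        "joy_x": jx,
        "joy_y": jy,
        # 10-bit accelerometer values: 8 high bits + 2 LSBs from status
        "accel_x": (ax << 2) | ((status >> 2) & 0x03),
        "accel_y": (ay << 2) | ((status >> 4) & 0x03),
        "accel_z": (az << 2) | ((status >> 6) & 0x03),
        "button_z": not (status & 0x01),  # bit cleared means pressed
        "button_c": not (status & 0x02),
    }
```

Map `joy_x`/`joy_y` onto servo angles and you have the eyeball tracking the stick.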
The eyeball looks great, but if you want to try it for yourself, accessing the design files and code will set you back $10, a price that might look attractive if an eye like this is the missing link for a project.
On the other hand, enjoying the video (embedded below) and getting ideas from [Enza3D]’s design notes will only cost you a few minutes.
Continue reading “Enjoy This Animatronic Eyeball’s Smooth Moves”
When a piece of hardware goes unsupported by a company, it can be frustrating. Bugs may no longer get fixed, or in the worst cases, perfectly good hardware can stop working entirely as software licenses time out. Sadly, for a group of people reliant on retinal implants from the company Second Sight, the company has since stopped producing and supporting the devices that give them a crude form of bionic sight.
The devices themselves consist of electrodes implanted into the retina, which send signals to the nervous system that appear as spots of light to the user. A camera feed captures images, which are then translated into signals sent to the retinal electrodes. The results are low-resolution to say the least, and the vision supplied is crude, but it gives blind users a rudimentary sense they never had before. It’s very much a visual equivalent of cochlear implant technology.
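Second Sight’s actual processing pipeline isn’t public, but the core translation step — collapsing a camera frame down to a few dozen electrode stimulation levels — can be sketched as simple block averaging. The 6×10 grid here is illustrative of why the resulting “vision” is so coarse:

```python
def frame_to_electrodes(frame, grid_rows=6, grid_cols=10):
    """Average a grayscale frame (list of rows, values 0-255) down to a
    small grid, one stimulation level per electrode.

    Illustrative only: a real implant pipeline also handles contrast,
    safety limits, and per-electrode calibration.
    """
    h, w = len(frame), len(frame[0])
    bh, bw = h // grid_rows, w // grid_cols
    grid = []
    for r in range(grid_rows):
        row = []
        for c in range(grid_cols):
            block = [
                frame[y][x]
                for y in range(r * bh, (r + 1) * bh)
                for x in range(c * bw, (c + 1) * bw)
            ]
            row.append(sum(block) // len(block))
        grid.append(row)
    return grid
```

An entire camera frame reduced to sixty brightness values makes it clear why users describe the experience as spots of light rather than pictures.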
The story is altogether too familiar; Second Sight Medical Products came out with a cutting-edge device, raised money, and put it out into the world, only to go bankrupt down the road, leaving its users high and dry. Over 350 people have had the implants fitted in one eye, while one Terry Byland is the sole person to have implants in both his left and right eyeballs. Performance of the device was mixed, with some users raving about it while others questioned its utility.
Continue reading “Bionic Implants Can Go Obsolete And Unsupported, Too”
Typically, when we want better eyesight, we look to tools like corrective lenses or laser eye surgery to improve optical performance. However, [Casey Connor 2] came across another method that uses light exposure to improve color vision, and set about trying to achieve the same results at home.
A recent study published in Nature showed that a single three-minute exposure to 670 nm light led to an improvement in color perception lasting up to a week. The proposed mechanism is that the cones in the eye get worse at producing ATP as we age, and with less of this crucial molecule supplying energy to cells in the eye, our color perception declines. Exposure to 670 nm light appears to spur the mitochondria in the eye to produce more ATP through a rather complicated physical interaction.
For [Casey’s] build, LEDs were used to produce the required 670 nm red light, installed into ping pong balls that were glued onto a pair of sunglasses. After calculating the right exposure level and blasting light into the eyes regularly each morning, [Casey] plans on running a chromaticity test in the evenings with a custom Python script to measure color perception.
[Casey] shows a proper understanding of the scientific process, and has accounted for the cheap monitor and equipment used in the testing. The expectation is that it should be possible to show a relative positive or negative drift, even if the results won’t be directly comparable to industry-grade measurements.
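That relative-drift idea is worth spelling out: compare each evening’s score against a pre-exposure baseline, and the absolute calibration error of the cheap monitor cancels out, since it’s the same for every measurement. A minimal sketch (function and score names are ours, not from [Casey]’s script):

```python
from statistics import mean

def relative_drift(baseline_scores, daily_scores):
    """Return each day's score as a fractional change from the baseline mean.

    Absolute numbers from an uncalibrated monitor aren't trustworthy, but
    as long as the setup stays constant, the relative change still is.
    """
    base = mean(baseline_scores)
    return [(score - base) / base for score in daily_scores]
```

A consistent positive drift over the treatment period would suggest the 670 nm exposure is doing something, even without lab-grade colorimetry.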
We’re eager to see the results of [Casey]’s testing, and might even be tempted to replicate the experiment if it proves successful. We’ve explored some ocular topics in the past too, like the technology that goes into eyeglasses. Video after the break.
Continue reading “DIY Glasses Aim To Improve Color Vision”
[Ramin Assadollahi] has been busy rebuilding and improving an Omnibot 5402, and the last piece of hardware he wanted to upgrade was some LED matrix eyes and a high-quality Raspberry Pi camera for computer vision. An Omnibot was something most technical-minded youngsters remember drooling over in the 80s, and when [Ramin] bought a couple of used units online, he went straight to the workbench to give the vintage machines some upgrades. After all, the Omnibot 5402 was pretty remarkable for its time, but it’s capable of much more with some modern hardware. One area that needed improvement was the eyes.
The eyes on the original Omnibot could light up, but that’s about all they were capable of. The first upgrade was installing two 8×8 LED matrix displays to form what [Ramin] calls Minimal Expressive Eyes (MEE), powered by a Raspberry Pi. With the help of a 3D-printed adapter and some clever layout, the LED matrix displays fit behind the eye plate, maintaining the original look while opening loads of new output possibilities.
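What makes an 8×8 matrix so appealing for this is how little data an expression takes: eight bytes, one bit per LED. [Ramin]’s firmware isn’t shown, so the patterns and names below are hypothetical, but they illustrate the approach:

```python
# Each expression is eight rows of eight bits; a set bit lights one LED.
EXPRESSIONS = {
    "open": [
        0b00111100,
        0b01000010,
        0b10000001,
        0b10011001,
        0b10011001,
        0b10000001,
        0b01000010,
        0b00111100,
    ],
    "closed": [
        0b00000000,
        0b00000000,
        0b00000000,
        0b11111111,
        0b11111111,
        0b00000000,
        0b00000000,
        0b00000000,
    ],
}

def render(expression: str) -> str:
    """Render an expression as ASCII art. A real build would push these
    same eight bytes to the matrix driver chip instead of printing."""
    rows = EXPRESSIONS[expression]
    return "\n".join(
        "".join("#" if row & (1 << (7 - col)) else "." for col in range(8))
        for row in rows
    )
```

Blinking is then just swapping between bitmaps on a timer, which is exactly the kind of job a Raspberry Pi barely notices.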
Adding a high-quality Raspberry Pi camera with a wide-angle lens was a bit more challenging and required an extra-long camera ribbon cable, but with the lens nestled just below the eyes, the camera has a good view and isn’t particularly noticeable when the eyes are lit up. Having already upgraded the rest of the hardware, all that remains now is software work, and we can’t wait to see the results.
Two short videos of the hardware are embedded below, be sure to give them a peek. And when you’re ready for more 80s-robot-upgrading-action, check out the Hero Jr.
Continue reading “Omnibot From The 80s Gets LED Matrix Eyes, Camera”
[markw2k9] has an Alexa device that sits in his kitchen and decided it was time to spruce it up with some rather uncanny eyes. With some inspiration from the Adafruit Uncanny Eyes project, which displays similar animated eyes, [markw2k9] designed a 3D-printed shell that goes on top of a 2nd-generation Amazon Echo. A Teensy 3.2 powers two OLED displays and monitors the light ring to know when to turn the lights on and show that your smart speaker is listening. The eyes look around in a shifty sort of manner. Light from the Echo’s LED ring is diffused through a piece of plexiglass that was lightly sanded on the outside ring, and the eye lenses are 30 mm cabochons (a glass lens often used for jewelry).
One hiccup is that the ring on the Echo glows in a steady pattern when there’s a notification. Since this would keep the OLEDs on almost continuously, raising concerns about the lifetime of the panels, [markw2k9] detects the condition in the state machine and drops into a timeout state. With that issue solved, the whole thing came together nicely. Where this project really shines is the design and execution. The case is sleek PLA and the whole thing looks professional.
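The firmware runs on a Teensy, but the timeout logic itself is language-agnostic. Here’s a hypothetical reconstruction of the idea in Python — state names and the threshold value are ours, not from [markw2k9]’s code:

```python
# If the light ring stays lit past a threshold, assume it's a steady
# notification glow and blank the OLEDs rather than burn them in.
NOTIFICATION_TIMEOUT = 10.0  # seconds; illustrative value

class EyeController:
    def __init__(self):
        self.state = "IDLE"
        self.lit_since = None

    def update(self, ring_lit: bool, now: float) -> str:
        """Advance the state machine given the ring state and a timestamp."""
        if not ring_lit:
            self.state, self.lit_since = "IDLE", None
        elif self.lit_since is None:
            self.state, self.lit_since = "EYES_ON", now  # ring just lit up
        elif now - self.lit_since > NOTIFICATION_TIMEOUT:
            self.state = "TIMEOUT"  # steady glow: notification, not a query
        return self.state
```

Short voice-query flashes never hit the timeout, while a persistent notification glow parks the displays after ten seconds.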
We’ve seen a few other projects inspired by the animated eyes project, such as this Halloween-themed robot that is honestly quite terrifying. The software and STL files for the smart speaker’s eyes are on GitHub and Thingiverse.
Continue reading “A Smart Speaker That Reminds You It’s Listening”
Some of us have computer mice with more buttons than we have fingers, tracking resolution finer than the naked eye can discern, and forced-air vents. All these features presuppose one thing: the user has a functioning hand. [Federico Runco] knows that amyotrophic lateral sclerosis (ALS, or Lou Gehrig’s disease) will rob a person of the ability to use standard computer inputs, or the joystick on a motorized wheelchair. He is building EyesDrive for the 2020 Hackaday Prize to restore that mobility to ALS patients. There are already some solutions, but this one focuses on a short bill of materials.
Existing systems are expensive and often track pupil location, which returns precise data, but EyesDrive only discerns left, right, and resting. For this, it needs three non-invasive electrodes and a custom circuit board with amplifiers, signal-processing circuits, and a microcontroller. He includes a Bluetooth socket on the custom PCB, which is the primary communication method. In the video below he steers a virtual kart around a knotty course to prove that his system is up to the task of an urban wheelchair.
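That three-way distinction is what keeps the bill of materials short: instead of a camera and gaze-tracking math, electrodes beside the eyes pick up the eye’s standing dipole potential (electrooculography), and classifying it can be as simple as thresholding the amplified differential voltage. [Federico]’s firmware isn’t shown, so the threshold and names here are made up for illustration:

```python
def classify_gaze(sample_mv: float, threshold_mv: float = 50.0) -> str:
    """Map one amplified EOG sample to a steering command.

    Gaze toward one electrode drives the differential voltage positive,
    toward the other negative; near zero means the eyes are at rest.
    Real firmware would also filter the signal and debounce transitions.
    """
    if sample_mv > threshold_mv:
        return "right"
    if sample_mv < -threshold_mv:
        return "left"
    return "rest"
```

Three output states map neatly onto a wheelchair’s needs: turn left, turn right, and go straight (or stop).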
EyesDrive by [Federico Runco] should not be confused with Eyedrivomatic, the 2015 Hackaday Prize winner led by two remarkable hackers, Steve Evans and Patrick Joyce.
Continue reading “Karting Hands-Free”