Video games are a great way to relax, and sometimes get your heart rate up at the same time. But unless you’re playing something like Dance Dance Revolution, the controls pretty much always require the use of both hands. Even the old Atari controller benefited from using the other hand for support.
But what if you don’t have the use of both hands? Or you have a repetitive stress injury? Or you just want to eat cheese curls with chopsticks while you play? [Akaki Kuumeri] has you covered with one of the hands-down greatest uses for 3D printing we’ve seen — a PlayStation DualShock 4 controller modified for one-handed use. If this looks familiar, it may be because [Akaki] made a PS5 controller version a while back, but who can get one of those, anyway?
Though [Akaki] does most of the demonstrating in the video below with their left hand, they were cool enough to make a right-handed version as well. In the left-handed version, the symbol buttons and right trigger are actuated with the left hand, and the right joystick is used by moving the whole controller against your leg, the table, the arm of the couch, or whatever you wish.
[Akaki] even designed some optional pieces, including a leg strap. The right-hand version of course does the D-pad instead. But what should the order of the arrow buttons be? After much contemplation, [Akaki] settled on the standard DDR configuration of ←↓↑→.
We love that the symbols are made from raw filament pressed into grooves, and think it’s totally awesome that this is made to be attached to the controller and removed with one hand. Check out the video below to see it in action with a handful of games.
Continue reading “The Coolest Controller Mod, Hands Down” →
If we count all the screens in our lives, it takes a hot minute. Some of them are touchscreens and some need a mouse or keyboard, but we have grown accustomed to all of these input devices. Not everyone can use them, though; people with cerebral palsy, for instance, often rely on eye-tracking hardware. Traditionally, that only works on the connected computer, so switching from a chair-mounted screen to a tablet on the desk is not an option. To give folks the ability to control different computers effortlessly, [Zack Freedman] is developing a head-mounted eye-tracker that is not tied to one computer. In a way, this is like a KVM switch, but way more futuristic. [Tony Stark] would be proud.
An infrared detector on the headset identifies compatible screens in line of sight and syncs up with its associated HID dongle. A headset-mounted color camera tracks the head position in relation to the screen while an IR camera scans the eye to calculate where the user is focusing. All the technology here is proven, but this new recipe could be a game-changer for anyone who has trouble with the traditional keyboard, mouse, and touchscreen. Maybe QR codes could assist the screen identification and orientation, like how a Wii remote and sensor bar work together.
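To picture how the two cameras might combine, here is a minimal sketch of turning a gaze point into a cursor position. It assumes (this is our illustration, not [Zack]'s code) that the head camera reports the detected screen as an axis-aligned rectangle in its image, and that the eye camera's output has already been projected into that same image:

```python
def gaze_to_screen(gaze_px, corners_px, screen_res):
    """Map a gaze point in head-camera pixels to target-screen pixels.

    gaze_px    -- (x, y) where the eye tracker says the user is looking,
                  expressed in the head camera's image
    corners_px -- ((left, top), (right, bottom)) of the detected screen
                  in the same image (assumed axis-aligned for simplicity)
    screen_res -- (width, height) of the target screen in pixels
    """
    (left, top), (right, bottom) = corners_px
    # Normalize the gaze point within the screen rectangle (0..1).
    u = (gaze_px[0] - left) / (right - left)
    v = (gaze_px[1] - top) / (bottom - top)
    # Clamp so a glance just past the bezel still lands on the edge.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return round(u * screen_res[0]), round(v * screen_res[1])
```

A real implementation would need a perspective transform rather than this simple normalization, since the screen rarely sits square in the camera frame, but the idea is the same: whichever screen the headset sees is the one the cursor lands on.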
Some of us have computer mice with more buttons than we have fingers, resolution tracking finer than the naked eye can discern, and forced-air vents. All these features presuppose one thing: the user has a functioning hand. [Federico Runco] knows that amyotrophic lateral sclerosis, ALS, or Lou Gehrig’s disease, will rob a person of their ability to use standard computer inputs, or the joystick on a motorized wheelchair. He is building EyesDrive for the 2020 Hackaday Prize to restore that mobility to ALS patients. There are already some solutions, but this one focuses on a short bill of materials.
Existing systems are expensive and often track pupil location, which returns precise data, but EyesDrive only discerns left, right, and resting. For these, we need three non-invasive electrodes, a custom circuit board with amplifiers, signal processing circuits, and a microcontroller. He includes a Bluetooth socket on the custom PCBs, which is the primary communication method. In the video below he steers a virtual kart around a knotty course to prove that his system is up to the task of an urban wheelchair.
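Because only three states matter, the signal processing can be refreshingly simple. Here is a hedged sketch of the idea: the electrodes pick up the eye's standing dipole (the cornea is positive relative to the retina), so a sideways glance swings the amplified voltage one way or the other, and a threshold sorts the result. The sign convention and threshold value are our assumptions for illustration, not figures from [Federico]'s build:

```python
def classify_eog(sample_mv, threshold_mv=0.3):
    """Classify one amplified EOG sample as 'left', 'right', or 'rest'.

    sample_mv    -- amplified electrode voltage in millivolts
    threshold_mv -- dead band around zero; values inside it count as rest
    """
    if sample_mv > threshold_mv:
        return "right"   # gaze swung toward the positive electrode
    if sample_mv < -threshold_mv:
        return "left"    # gaze swung the other way
    return "rest"        # eyes roughly centered, wheelchair holds course
```

The dead band is what keeps the wheelchair from twitching every time the user blinks or drifts their gaze slightly; a production system would also debounce over several samples.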
EyesDrive by [Federico Runco] should not be confused with the 2015 Hackaday Prize winner, Eyedrivomatic, led by two remarkable hackers, Steve Evans and Patrick Joyce.
Continue reading “Karting Hands-Free” →
Assistive devices for people with disabilities can make an inestimable difference to their lives, but with a combination of technology, complexity, and often one-off builds for individual needs, they can be eye-wateringly expensive. When the recipient is a young person who may grow out of more than one device as they mature, this cost can be prohibitive. Some way to cut down on the expense is called for, and [Phil Malone] has identified the readily available hoverboard as a possible source of motive power for devices that need it.
Aside from being a children’s toy, hoverboards have been well and truly hacked; we’ve featured them in Hacky Racers, and as hacker camp transport. But this is an application which demands controllability and finesse not needed when careering round a dusty field. He’s taken that work and built upon it to produce a firmware that he calls HUGS, designed to make the hoverboard motors precisely controllable. It’s a departure from the norm in hoverboard hacking, but perhaps it can open up new vistas in the use of these versatile components.
There is much our community can do when it comes to improving access to assistive technologies, and we hope that this project can be one of the success stories. We would however caution every reader to avoid falling into the engineer savior trap.
Forgetting things happens to all of us, young and old. We know something is on our to-do list, but in the heat of the moment it disappears from our minds and we miss it. There are a myriad of technological answers to this in the form of reminders and calendars, but [Nick Bild] has come up with possibly the most inventive yet. His Newrons project is a pair of glasses with a machine vision camera that flashes a light when it detects an object in its field of view associated with a calendar entry.
At its heart is a JeVois A33 Smart Machine Vision Camera, which runs a neural network trained on an image dataset. It passes its sightings to an Arduino Nano IoT fitted with a real-time clock, which pulls appointment information from Google Calendar and flashes the LED when it detects a match between object and event. His example, which we’ve placed below the break, is a pill bottle triggering a reminder to take the pills.
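The matching step is the neat trick here, and it reduces to a simple join between what the camera sees and what the calendar says. The sketch below is our own illustration of that logic (the keyword format and the one-hour window are assumptions, not details from [Nick]'s firmware):

```python
import datetime

def reminder_due(detected_label, events, now, window_minutes=60):
    """Return True if a seen object matches a calendar event starting soon.

    detected_label -- class name from the vision network, e.g. "pill bottle"
    events         -- list of (start_time, keyword) tuples from the calendar
    now            -- current time from the real-time clock
    """
    for start, keyword in events:
        # Only consider events starting within the lookahead window.
        soon = datetime.timedelta(0) <= (start - now) <= datetime.timedelta(
            minutes=window_minutes)
        if soon and keyword.lower() in detected_label.lower():
            return True  # flash the LED
    return False
```

Putting the time check first means the pill bottle can sit in view all day without nagging; the light only fires when the object and an imminent appointment line up.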
We like this idea, but can’t help thinking that it has a flaw in that the reminder relies on the object moving into view. A version that tied this in with more conventional reminding based upon the calendar would address this, and perhaps save the forgetful a few problems.
Continue reading “Assistive Specs Help Jog Your Memory” →
This tip comes our way courtesy of [Elad Orbach], who’s been experimenting with a device that uses a servo to turn the function dial on a multimeter. It’s something you can put together in a few minutes with leftovers from the parts bin, and as you can see in the video after the break, the basic concept seems to be sound enough.
As to finding a practical reason for spinning the switch on your meter with a servo, that’s left largely as an exercise for the reader. [Elad] hints at the possibility of using such a setup to help automate repetitive testing, which we could see being useful especially in combination with a foot pedal that allows you to switch modes without having to put the probes down. The same basic idea could also be helpful as an assistive device for those who have difficulty grasping or limited dexterity.
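For automated or foot-pedal switching, the servo just needs a lookup from dial detent to angle. A minimal sketch, assuming evenly spaced detents across a standard 180-degree servo (a real build would calibrate each stop against the actual dial, and the position count here is an assumption):

```python
def dial_angle(position, n_positions=12, travel_deg=180):
    """Servo angle for the Nth detent on the multimeter's function dial.

    position    -- detent index, 0 through n_positions - 1
    n_positions -- number of stops on the dial (assumed, varies by meter)
    travel_deg  -- usable servo travel in degrees
    """
    if not 0 <= position < n_positions:
        raise ValueError("no such dial position")
    # Evenly distribute the detents across the servo's travel.
    return position * travel_deg / (n_positions - 1)
```

On an actual rig you would feed this angle to the servo library of your choice and, ideally, store a per-detent calibration table instead of trusting the dial to be perfectly linear.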
Whether top of the line or bottom of the barrel, the multimeter is easily the hardware hacker’s most frequently used tool (beyond the screwdriver, perhaps). We’ve seen plenty of projects that try to graft additional features onto this common gadget, though automation isn’t usually among them.
Continue reading “This Servo Actuated Multimeter Does The Twist” →
For their final project in embedded microcontroller class, [Aaheli, Jun, and Naomi] turned their focus toward assistive technology and created an Electronic Travel Aid (ETA) for the visually impaired that uses haptic feedback to report the presence of obstacles.
We have seen a few of these types of devices in the past, and they almost always use ultrasonic sensors to gauge distance. Not so with this ETA; it uses six VL53L0X time-of-flight (ToF) sensors mounted at slightly different angles from each other, which provides a wide sensing map. It is capable of detecting objects in a one-meter-wide swath at a range of one meter from the sensors.
The device consists of two parts, a wayfinding wand and a feedback module. The six ToF sensors are strapped across the end of a flashlight body and wired to an Arduino Mini inside the body. The Mini receives the sensor data over UART and sends it to the requisite PIC32, which is attached to a sleeve on the user’s forearm. The PIC decodes these UART signals into PWM and lights up six corresponding vibrating disc motors that dangle from the sleeve and form a sensory cuff bracelet around the upper forearm.
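The mapping from distance to vibration is where the haptic feedback comes from, and it can be as simple as a linear ramp per sensor. Here is an illustrative sketch of one way to do it; the linear curve and 8-bit PWM range are our assumptions, not details lifted from the team's PIC32 code:

```python
def haptic_duty(distance_mm, max_range_mm=1000):
    """PWM duty (0-255) for one vibration motor from one ToF reading.

    Closer obstacles vibrate harder; anything at or beyond the
    one-meter sensing range leaves the motor off.
    """
    if distance_mm >= max_range_mm:
        return 0
    return round(255 * (1 - distance_mm / max_range_mm))

def update_motors(readings_mm):
    """One duty value per sensor, each driving its own disc motor."""
    return [haptic_duty(d) for d in readings_mm]
```

Because each of the six motors maps to one of the six angled sensors, the user feels not just that something is ahead but roughly where it is across the one-meter-wide swath.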
We like the use of ToF over ultrasonic for wayfinding. Whether ToF is faster or not, the footprint is much smaller, so it’s more practical for discreet assistive wearables. Plus, you know, lasers. You can see how well it works in the demo video after the break.
This device is intended to augment the traditional white cane, not replace it. This virtual cane we saw a few years ago is another story.
Continue reading “Find Your Way With Tiny Laser Beams” →