Do you have a spare mobility scooter sitting unused in your garage? Or, maybe you’ve got a grandmother who has been complaining about how long it takes her to get to bingo on Tuesdays? Has your local supermarket hired you to improve grocery shopping efficiency between 10am and 2pm? If you answered “yes” to any of those questions, then the guys over at Photon Induction have an “overclocked” mobility scooter build which should provide you with both inspiration and laughs.
They’ve taken the kind of inexpensive mobility scooter that can be found on Craigslist for a couple hundred dollars and increased the battery voltage to simultaneously improve performance and reduce safety. Their particular scooter normally runs on 24V; bumping it up to 60V was all it took to drastically increase the driving speed (72V ended up burning out the motors).
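As a back-of-the-envelope check on why the voltage bump works, a brushed DC motor's no-load speed scales roughly linearly with supply voltage. This sketch is an idealized estimate, not a measurement from the build; real motors fall short of it under load, and as the 72V attempt showed, windings have limits.

```python
# Idealized estimate: a brushed DC motor's no-load speed is roughly
# proportional to its supply voltage, so overvolting gives a simple ratio.
def speed_multiplier(stock_volts: float, boosted_volts: float) -> float:
    """Approximate top-speed gain from overvolting a brushed DC motor."""
    return boosted_volts / stock_volts

print(speed_multiplier(24, 60))  # roughly 2.5x the stock top speed
```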
Other than increasing the battery output voltage, only a couple of other small hacks were necessary to finish the build. Normally, the scooter uses a clutch to provide a gentle start. However, the clutch wasn’t up to the task of handling 60V, so the ignition switch was modified to fully engage the clutch before power is applied. The horn button was then used as the accelerator, which simply engages a solenoid with massive contacts that can handle 60V. The result is a scooter that is bound to terrify your grandmother, but which will get her to bingo in record time.
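The switching order described above can be sketched as a tiny state machine. This is a hypothetical illustration of the interlock logic, not Photon Induction's actual wiring: the ignition switch fully engages the clutch first, and the repurposed horn button may only close the drive solenoid once the clutch is locked in.

```python
# Hypothetical sketch of the modified switching order: clutch first,
# then the horn button (now the accelerator) closes the drive solenoid.
class ScooterInterlock:
    def __init__(self):
        self.clutch_engaged = False
        self.power_applied = False

    def ignition_on(self):
        # Fully engage the clutch before any drive power can flow.
        self.clutch_engaged = True

    def horn_button(self, pressed: bool):
        # The drive solenoid only closes if the clutch is already engaged.
        self.power_applied = pressed and self.clutch_engaged
```

Doing the engagement in hardware with a heavy-contact solenoid sidesteps the stock clutch's inability to soft-start at 60V.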
There is a lot of helpful technology for people with mobility issues. Even something that can help people do something most of us wouldn’t think twice about, like turn on a lamp or control a computer, can make a world of difference to someone who can’t move around as easily. Luckily, [Matt] has been working on using webcams and depth cameras to allow someone to do just that.
[Matt] found that webcams tend to be less obtrusive than depth cameras (like the Kinect), but they are more limited in their ability to distinguish individual users and, of course, lack the same 3D capability. With either technology, though, the software implementation is similar: the camera detects head motion and controls software accordingly by emulating keystrokes. The depth cameras are a little more user-friendly, though, and allow users to move in whichever way feels comfortable for them.
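The mapping step at the heart of such a system can be sketched simply. This is an illustrative example, not [Matt]'s code; the camera and face-tracking front end (OpenCV, for instance) is assumed, and the function below only converts the tracked head's offset from a neutral position into an emulated arrow-key press, with a dead zone so small involuntary movements don't trigger input.

```python
# Illustrative mapping from head offset (pixels from neutral position,
# as reported by some face tracker) to an emulated arrow-key press.
def head_offset_to_key(dx: int, dy: int, deadzone: int = 30):
    """Return 'left'/'right'/'up'/'down', or None inside the dead zone."""
    if abs(dx) <= deadzone and abs(dy) <= deadzone:
        return None  # small movements are ignored
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

Tuning the dead zone per user is one way such software can accommodate whatever range of motion feels comfortable.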
This isn’t the first time something like a Kinect has been used to track motion, but for [Matt] and his work at Beaumont College it has been an important area of ongoing research. It’s especially helpful since the campus has many devices (like lamps) on network switches, so this software can be used to help people interact much more easily with the physical world. This project could be very useful to anyone curious about motion tracking, even if they’re not using it for mobility reasons.
The use of brainwaves as control parameters for electronic systems is becoming quite widespread. The types of signals that we have access to are still quite primitive compared to what we might aspire to in our cyberpunk fantasies, but they’re a step in the right direction.
A very tempting aspect of accessing brain signals is that they can be used to circumvent physical limitations. [Jerkey] demonstrates this with his DIY brain-controlled electric wheelchair, which can move people who wouldn’t otherwise have the capacity to operate joystick controls. The approach is direct: a laptop marshals the EEG data and passes it to an Arduino, which simulates joystick operations for the wheelchair’s control board. From experience we know that EEGs can be difficult to control off the bat, and [Jerkey]’s warning at the beginning of the Instructable about having a spotter with a finger on the “off” switch is well worth heeding. Some automated collision avoidance might be a useful addition.
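The EEG-to-joystick step might look something like the following. This is a hypothetical sketch, not [Jerkey]'s actual code: it assumes a normalized "intent" value (0.0 to 1.0) from the EEG, thresholds it so noise doesn't creep the chair forward, and gives a spotter-operated kill switch absolute priority over the brain signal.

```python
# Hypothetical mapping from a normalized EEG intent level (0.0-1.0) to a
# forward joystick deflection (0.0-1.0), with a hard kill switch.
def eeg_to_joystick(intent: float, kill_switch: bool,
                    threshold: float = 0.6) -> float:
    """Map EEG intent to joystick deflection; the kill switch always wins."""
    if kill_switch or intent < threshold:
        return 0.0  # below threshold (or spotter override): full stop
    # Scale the range above the threshold onto the full deflection range.
    return (intent - threshold) / (1.0 - threshold)
```

Keeping the override check first in the control path, rather than downstream of the EEG processing, is exactly the kind of safety margin the spotter-with-a-finger-on-the-switch advice points at.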
We’ve covered voice-operated wheelchairs before, and we’d like to know how the two types of control would stack up against one another. EEGs are more immediate than speech, but we imagine that they’re harder to control.
It would be interesting, albeit somewhat trivial, to see an extension of [Jerkey]’s technique as a way to control an ROV like Oberon, although depending on the faculties of the operator, speech control could be difficult (would that make it more convincing as an alien robot diplomat?).
Researchers at the University of Delaware are helping disabled kids by designing robotic transportation for them. Exploring one’s environment is an important part of early development, and disabilities that limit mobility can prevent young children from experiencing it. Typically, children are not offered a powered wheelchair until they are five or six years old, but intelligent technologies like those found in the UD1 make this possible at a much younger age. Proximity sensors all around the robot’s drive unit provide obstacle avoidance and ensure safety around other children. When confronted with an obstacle, the UD1 will either stop or navigate around it. The unit is controlled by a joystick in front of the rider, but it can also be overridden remotely by a teacher, parent, or caregiver.
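The stop-or-steer behavior described can be sketched as a simple decision rule. This is a simplified illustration assuming three forward-facing proximity sensors; the UD1's actual sensor layout and control logic are not documented here.

```python
# Simplified obstacle handling for an assumed three-sensor layout
# (left, center, right distances in cm): stop when boxed in, otherwise
# steer toward whichever side reads more clearance.
def avoid(left_cm: float, center_cm: float, right_cm: float,
          stop_dist: float = 40.0) -> str:
    """Return 'forward', 'stop', 'steer_left', or 'steer_right'."""
    if center_cm > stop_dist:
        return "forward"
    if left_cm <= stop_dist and right_cm <= stop_dist:
        return "stop"  # boxed in: wait, or let a caregiver take over
    return "steer_left" if left_cm > right_cm else "steer_right"
```

A remote override from a teacher or parent would sit above this loop, preempting whatever the sensors decide.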
[via Robot Gossip]