Virtual reality is a slow-moving field in some respects. While a lot of focus is put on optical technologies and headsets, there’s a lot more involved when it comes to believably placing a human being in a virtual environment. So far, we’ve gotten a good start on the visuals and head tracking, but interaction technology still lags well behind. [Lucas] is working in just that area, iterating heavily on his homebrew VR gloves.
The gloves consist of potentiometers fitted with spools, each attached by a string to the tip of a digit on the wearer’s hand. As the user curls their fingers, the potentiometers turn and the position of the fingers can be measured. The potentiometers are all read by an Arduino, which communicates back to a PC via USB. A custom driver then interacts with Valve’s SteamVR software, allowing the glove to be used with a wide variety of existing software.
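For a sense of how simple the firmware side of a rig like this can be, here’s a minimal sketch of the concept. The pin assignments and serial format are our own assumptions, not [Lucas]’s actual code:

```cpp
// A minimal sketch of the idea: read one potentiometer per finger and
// stream the values to the PC over USB serial. Pin assignments and the
// serial format here are our own assumptions, not [Lucas]'s firmware.
#include <Arduino.h>

const uint8_t FINGER_PINS[5] = {A0, A1, A2, A3, A4}; // one pot per digit

void setup() {
  Serial.begin(115200);
}

void loop() {
  // One comma-separated frame per line, e.g. "512,488,601,570,330"
  for (uint8_t i = 0; i < 5; i++) {
    Serial.print(analogRead(FINGER_PINS[i])); // 0-1023 tracks finger curl
    Serial.print(i < 4 ? ',' : '\n');
  }
  delay(10); // roughly 100 updates per second
}
```

The driver on the PC side then only needs to parse five numbers per frame and map them onto finger curl for SteamVR.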
Thus far, the system merely tracks finger position, but the spool-and-string design is intended to support motors down the line, so each finger can be given resistance and the user can gain the feeling of touching and interacting with virtual objects. The project has the potential to be a cheaper, more accessible alternative to current off-the-shelf solutions. We’ve seen other hand-tracking gloves before, too – though none that track the spread of a wearer’s hand as well as the finger extension. If you’re working on precisely that, please do drop us a line. Video after the break.
Continue reading “Virtual Reality Gloves Aim To Improve Interactivity”
By this point, pretty much everyone has come across a word clock project, if not built one themselves. There’s just an appeal to looking at a clock and seeing the time in a more human form than mere digits on a face. But there are senses beyond sight. Have you ever heard a word clock? Have you ever felt a word clock? These are questions to which Hackaday’s own [Moritz Sivers] can now answer yes, because he’s gone through the extreme learning process involved in designing and building a haptic word clock driven with the power of magnets.
Individual letters of the display are actuated by a matrix of magnetic coils on custom PCBs. These work in a vaguely similar fashion to LED matrices, except they generate magnetic fields that can push or pull on a magnet instead of generating light. As such, there are a variety of different challenges to be tackled: from coil design, to handling the increased power draw, to even considering how coils interact with their neighbors. Inspired by research on other haptic displays, [Moritz] used ferrous foil to make the magnets latch into place. This way, each letter stays in its forward or back position without the coil being powered to hold it there. Plus, the letter remains more stable while nearby coils are activated.
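For the curious, pulsing a single coil in such a matrix might look something like the sketch below. A real build needs H-bridge drivers to handle the current and flip polarity; the pins and timings here are purely illustrative assumptions:

```cpp
// Illustrative sketch: briefly energize one coil in the matrix with a
// selectable polarity, so the magnet behind a letter is pushed or pulled.
// Pin roles and timings are assumptions, not [Moritz]'s actual design.
#include <Arduino.h>

const uint8_t ROW_PIN = 2; // enables this coil's row driver
const uint8_t COL_PIN = 3; // enables this coil's column driver
const uint8_t DIR_PIN = 4; // H-bridge polarity: push vs. pull

void pulseCoil(bool push) {
  digitalWrite(DIR_PIN, push ? HIGH : LOW); // field direction
  digitalWrite(ROW_PIN, HIGH);              // energize the selected coil
  digitalWrite(COL_PIN, HIGH);
  delay(20);                                // a short pulse is all it takes...
  digitalWrite(ROW_PIN, LOW);               // ...because the ferrous foil
  digitalWrite(COL_PIN, LOW);               // latches the magnet in place
}

void setup() {
  pinMode(ROW_PIN, OUTPUT);
  pinMode(COL_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
}

void loop() {
  pulseCoil(true);  // pop the letter forward
  delay(1000);
  pulseCoil(false); // pull it back flush
  delay(1000);
}
```

The latching is what makes the power budget workable: coils only draw current during the brief pulse, never to hold a state.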
Part of the fun of “ubiquitous” projects like word clocks is seeing how creative hackers can get to make their own creations stand out. Whether it’s a miniaturized version of classic designs or something simple and clean, we love to see them all. Unsurprisingly, [Moritz] himself has impressed us with his unique take on word clocks in the past. (Editor’s note: that’s nothing compared to his cloud chambers!)
Check out the video below to see this display’s actuation in action. We’re absolutely in love with the satisfying *click* the magnets make as they latch into place.
Continue reading “The Word Clock You Can Feel”
Rumble first hit the gaming mainstream back in the mid-1990s, and has been de rigueur for console players using gamepads ever since. It’s less prevalent on the PC, because most players rely on keyboards and mice that are rumble-free. However, innovation is possible, and [ilge] put together a modified mouse for shooters that has an excellent recoil feedback device.
The feedback effect is run by an Arduino, which receives serial data from a Python program running on the host computer. When the mouse is clicked, the Python program notifies the Arduino, which then fires a bank of four solenoids repeatedly back-and-forth to generate the feedback effect. The solenoids are triggered by a relay, which is an easy way to switch such a load, though we suspect it may not hold up well over time, given the rapid fire rate and the likelihood of contact arcing from the solenoids’ high inrush current.
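The firmware end of a trick like this can be very compact. Here’s one way it might look, with the command byte, pin choice, and timings all being our assumptions rather than [ilge]’s actual code:

```cpp
// Sketch of the Arduino end: wait for a "fire" byte from the Python
// program and rattle the solenoid relay a few times per shot.
// Command byte, relay pin, and timings are illustrative assumptions.
#include <Arduino.h>

const uint8_t RELAY_PIN = 7; // relay switching the solenoid bank

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available() && Serial.read() == 'F') { // one 'F' per click
    for (uint8_t i = 0; i < 4; i++) { // rapid back-and-forth kick
      digitalWrite(RELAY_PIN, HIGH);
      delay(30);
      digitalWrite(RELAY_PIN, LOW);
      delay(30);
    }
  }
}
```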
It’s a simple build that nonetheless adds a great haptic feedback effect to the otherwise humble computer mouse. While we don’t expect to see pros using the device anytime soon, it’s a great concept that does add to the shooter experience. Similar hardware could likely be put to great use in a VR context, too. The state of the art of haptic technology continues to move at a rapid pace, and we can’t wait to see what comes next. Video after the break.
Continue reading “A Gaming Mouse With Recoil Feedback”
Controlling a single drone takes a considerable amount of concentration, and normally involves wearing silly goggles. It only gets harder if you want to control a swarm. Researchers at the Skolkovo Institute of Science and Technology (Skoltech) decided Jedi mind tricks were the best way, and set up swarm control using hand gestures.
We’ve seen something similar at the Intel booth at the 2016 Maker Faire. In that demo, a single drone was controlled by hand gestures using a hacked Nintendo Power Glove. The Skoltech approach builds on that concept with a lot of innovation. For one, haptics in the fingertips of the glove provide feedback on the current behavior of the drones. Through their research, the team found that most operators quickly learned to interpret the vibrations subconsciously.
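As a heavily simplified sketch of the fingertip feedback idea, imagine the host streaming a single byte that the glove maps straight to vibration strength. Everything below is an illustrative assumption; the Skoltech glove is considerably more sophisticated:

```cpp
// Toy version of haptic swarm feedback: the host sends one byte per
// update (0-255) describing some swarm parameter, and the glove turns
// it into vibration intensity. Purely illustrative, not Skoltech's code.
#include <Arduino.h>

const uint8_t MOTOR_PIN = 5; // PWM pin driving a coin vibration motor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    analogWrite(MOTOR_PIN, Serial.read()); // stronger buzz = tighter swarm
  }
}
```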
That feedback also increases the safety of the swarm, which is a prime factor in making these technologies usable outside of the lab. Most of us have at one point frantically typed commands into a terminal or pulled cords to keep a project from destroying itself or behaving dangerously. An intuitive control scheme means that an operator can react quickly to changes in the swarm’s behavior.
The biggest advantage, which can be seen in the video after the break, is that hand control eliminates much of the path preprogramming that is currently common in swarm robotics. With tech like this, we can imagine a person quickly being trained on drone swarms and then using them to do things like construction surveys with ease. As an added bonus, the researchers were nice enough to post a preprint of their paper on arXiv if any readers would like to get into the specifics.
Continue reading “Use Jedi Mind Tricks To Control Your Next Drone Swarm”
Picture this: You’re in your bed in the middle of the night, and you want to know what time it is. Bedside alarm clocks are a thing of the past and now you rely on your smartphone to tell the time. Only, if you turned the screen on, you’d find that looking at it in the dark is tantamount to staring at the sun without eye protection. [Michael] pictured the same thing and his solution for this scenario is a clever haptic-feedback clock.
The idea behind it is simple: a clock from which you can tell the time without having to use your eyes. This one gives you two options for that, the first being a series of haptic pulses that let you tell the time simply by touching the device. The second, audibly telling the time with voice samples stored in a flash chip, was added in the second revision as [Michael] continues to refine his design. In addition to helping us check the time in the dark, it’s worth noting that this could also be useful for those with visual impairments.
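[Michael]’s exact pulse scheme is his own, but one plausible encoding might count hours with long buzzes and tens of minutes with short ones, along these lines (pin choice and timings assumed):

```cpp
// Hypothetical pulse encoding for a touch-to-tell-time clock: long
// buzzes for each hour, short buzzes for each ten minutes. This is a
// sketch of the concept, not [Michael]'s actual scheme.
#include <Arduino.h>

const uint8_t MOTOR_PIN = 9; // vibration motor via a driver transistor

void buzz(uint16_t ms) {
  digitalWrite(MOTOR_PIN, HIGH);
  delay(ms);
  digitalWrite(MOTOR_PIN, LOW);
  delay(250); // gap so pulses stay countable by feel
}

void tellTime(uint8_t hours, uint8_t minutes) {
  for (uint8_t i = 0; i < hours; i++) {
    buzz(400); // one long pulse per hour
  }
  delay(800);  // pause between the hour and minute fields
  for (uint8_t i = 0; i < minutes / 10; i++) {
    buzz(150); // one short pulse per ten minutes
  }
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  tellTime(3, 40); // 3:40 -> three long buzzes, then four short ones
}

void loop() {}
```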
Until we can see the final product, you can help him out by looking over the designs and sending pull requests over at the project’s GitHub page, or just watch his progress on the Hackaday.io page. We’ve seen some interesting ways to tell the time before, from a game of Tetris to a clock housed inside the shell of an old-school camera flash, but we’ve never seen one that uses haptic feedback. We hope for the sake of our eyes that it catches on!
A question: if you’re controlling the classic video game Street Fighter with gestures, aren’t you just, you know, street fighting?
That’s a question [Charlie Gerard] is going to have to tackle should her AI gesture-recognition controller experiments take off. [Charlie] put together the game controller to learn more about the dark arts of machine learning in a fun and engaging way.
The controller consists of a battery-powered Arduino MKR1000 with WiFi and an MPU6050 accelerometer. Held in the hand, the controller streams accelerometer data to an external PC, capturing the characteristics of the motion. [Charlie] recorded three different moves – a punch, an uppercut, and the dreaded Hadouken – collecting hundreds of examples of each. The raw data was massaged, converted to tensors, and used to train a model for the three moves. Initial tests seem to work well. [Charlie] also made an online version that captures motion from your smartphone. The demo is explained in the video below; sadly, we couldn’t get more than three Hadoukens in before crashing it.
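The capture side of a project like this is refreshingly simple. Here’s a minimal sketch that reads the MPU6050’s raw accelerometer registers and streams CSV samples for the training program to record. Note that [Charlie]’s MKR1000 can stream over WiFi; we use plain serial here for brevity, and the sample rate and format are assumptions:

```cpp
// Minimal data-capture sketch: read raw accelerometer values from the
// MPU6050 over I2C and stream them as CSV. Register addresses come from
// the MPU6050 datasheet; rate and output format are our assumptions.
#include <Arduino.h>
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68; // default MPU6050 I2C address

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B); // PWR_MGMT_1: wake the sensor from sleep
  Wire.write(0x00);
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B); // ACCEL_XOUT_H: first of six accel data registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);
  uint8_t raw[6];
  for (uint8_t i = 0; i < 6; i++) {
    raw[i] = Wire.read();
  }
  int16_t ax = (raw[0] << 8) | raw[1];
  int16_t ay = (raw[2] << 8) | raw[3];
  int16_t az = (raw[4] << 8) | raw[5];
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.println(az);
  delay(20); // ~50 samples/s; each recorded move becomes one training example
}
```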
With most machine learning projects seeming to concentrate on telling cats from dogs, this is a refreshing change. We’re seeing lots of offbeat machine learning projects these days, from cryptocurrency wallet attacks to a semi-creepy workout-monitoring gym camera.
Continue reading “Arduino, Accelerometer, And TensorFlow Make You A Real-World Street Fighter”
The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touchscreen gestures needed to operate a phone, like pinching, are nigh impossible when using it one-handed, and woe unto those with stubby thumbs when trying to take a selfie.
You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al., the haptic devices are designed to create characteristic acceleration profiles on a phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the algorithm watches for the acceleration signature caused by actuating a Vidget and sends commands to the phone to mimic the corresponding gestures. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
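The paper’s actual recognition pipeline is more involved, but the core idea can be sketched as template matching: compare a short window of IMU samples against a stored acceleration profile for each Vidget and take the best match. All names, numbers, and thresholds below are illustrative assumptions:

```cpp
// Toy Vidget classifier: normalized dot-product similarity between a
// live window of IMU samples and each Vidget's stored acceleration
// template. A sketch of the idea, not the paper's actual algorithm.
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Cosine-style similarity, so amplitude matters less than pulse shape.
double similarity(const std::vector<double>& window,
                  const std::vector<double>& tmpl) {
  double dot = 0, nw = 0, nt = 0;
  for (std::size_t i = 0; i < window.size() && i < tmpl.size(); i++) {
    dot += window[i] * tmpl[i];
    nw  += window[i] * window[i];
    nt  += tmpl[i] * tmpl[i];
  }
  return dot / (std::sqrt(nw) * std::sqrt(nt) + 1e-9);
}

// Index of the best-matching template, or -1 if nothing clears the
// threshold (i.e., the bump was probably just handling noise).
int classify(const std::vector<double>& window,
             const std::vector<std::vector<double>>& templates,
             double threshold = 0.8) {
  int best = -1;
  double bestScore = threshold;
  for (std::size_t i = 0; i < templates.size(); i++) {
    double s = similarity(window, templates[i]);
    if (s > bestScore) {
      bestScore = s;
      best = static_cast<int>(i);
    }
  }
  return best;
}

int main() {
  std::vector<std::vector<double>> templates = {
      {0, 1, 0, -1, 0},           // sharp click: a push button
      {0, 0.25, 0.5, 0.75, 1.0}}; // slow ramp: a scroll wheel
  std::vector<double> live = {0, 0.9, 0.1, -0.8, 0};
  std::cout << "best match: " << classify(live, templates) << "\n"; // 0 = button
}
```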
This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.
Continue reading “Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer”