[AlexPewPew] tipped us off on some interesting virtual reality work going on at the Swiss Federal Institute of Technology in Zurich. Mapping a user’s head movement to match the images shown in a head mounted display is something the Oculus Rift is very good at. But walking and moving around freely in that virtual environment requires completely different hardware. We’ve seen some ingenious setups before, but nothing as efficient as this.
In the video above, they have put sheets of bar-coded paper on the ceiling in a grid pattern. A camera mounted on the user’s head looks up at the grid of papers and computes the user’s location. The neatest part, though, is how they fit a large virtual space into a small room. As the user walks down a straight virtual path, software slowly curves the actual path in the small room. The end result is that the user walks in circles in the small room, thinking he or she is exploring a much larger space. Neat stuff!
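That path-bending trick is known in the VR literature as redirected walking: inject a tiny rotation into the physical heading on every step, small enough to go unnoticed, and a straight virtual walk becomes a physical circle. Here’s a minimal sketch of the idea (our own illustration, not the Zurich team’s code, and the gain is wildly exaggerated so the effect is obvious in a few lines):

```python
import math

def redirect_step(phys_heading, step_length, curvature_gain):
    """Advance the user's physical position by one step.

    The user believes they are walking straight, but we rotate the
    physical heading by curvature_gain radians per meter walked,
    bending the real path into a circle while the virtual path
    stays straight."""
    new_heading = phys_heading + curvature_gain * step_length
    dx = step_length * math.cos(new_heading)
    dy = step_length * math.sin(new_heading)
    return new_heading, dx, dy

# Walk 100 one-meter "straight" virtual steps with an exaggerated
# gain of one full circle per 100 m. Physically, the user ends up
# right back where they started.
heading, x, y = 0.0, 0.0, 0.0
gain = 2 * math.pi / 100  # radians per meter, huge for demo purposes
for _ in range(100):
    heading, dx, dy = redirect_step(heading, 1.0, gain)
    x, y = x + dx, y + dy
```

A real system would use a far smaller gain, below the user’s perception threshold, which is exactly why the room has to curve the path so gradually.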
If you have a head mounted display lying around, and can’t think of anything to enter into The Hackaday Prize contest, this would be a great concept to work on. What are you waiting for…get hacking!
Thanks to [AlexPewPew] for the tip!
The most common way to put some sort of haptic feedback in an interface hasn’t changed much since the plug-in rumble pack for the Nintendo 64 controller – just put a pager motor in there and set it spinning when the user needs to feel something. This method takes a relatively long time to spin up, and even the very cool Steam controller, with voice coils behind its directional pads, can’t ‘stick’, or stay high or low to notify the user of something.
[Tim]’s day job is working with very fancy piezoelectric actuators, and when an opportunity came up to visit the Haptics Symposium, he jumped at the chance to turn these actuators into some sort of interface. He ended up creating two devices: a two-piezo, cellphone-sized device, and a mouse with a left click button that raises and lowers in response to the color of the mousepad.
The cellphone device contains two piezo actuators with a 10 gram weight epoxied on. A small microcontroller and piezo driver give this pseudo phone the smoothest vibration functions you can imagine. The much more innovative color-sensing mouse has a single actuator glued to the left button, and a photosensor in the base. When the mouse rolls over a dark square on a piece of paper, the button raises. Rolling over a lighter area, the button lowers. It’s all very, very cool tech and something we’ll probably see from Apple, Microsoft, or Sony in a few years.
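The button logic itself is simple enough to sketch. The thresholds and the hysteresis dead band below are our own assumptions, not [Tim]’s actual firmware values:

```python
def button_height(light_level, lo=0.35, hi=0.65, prev=0.0):
    """Map a normalized photosensor reading (0 = dark, 1 = bright) to a
    piezo drive level (1.0 = button fully raised, 0.0 = lowered).

    The dead band between lo and hi adds hysteresis, so a noisy
    reading near a threshold doesn't make the button chatter."""
    if light_level < lo:
        return 1.0   # dark square under the mouse: raise the button
    if light_level > hi:
        return 0.0   # light area: drop the button back down
    return prev      # in the dead band: hold the last position
```

Feed the returned level to the piezo driver’s amplitude input and you get the raise/lower behavior described above.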
Videos of both devices below.
Continue reading “Piezos For Haptic Feedback”
By now you should be familiar with MAME arcade cabinets and their ability to emulate any classic arcade machine from the days of yore. PinMAME is a similar setup to reconstruct classic pinball machines on computer monitors, but its popularity is nothing compared to the machines that play everything from Galaga to The Simpsons arcade game. We won’t speculate on the reasons for that, but we do know how to make pinball emulation awesome – you need to emulate the buzzing and 60 Hz hum of the solenoids found in the original machines.
This project comes from [Brendan Schrader] of the Hive76 hackerspace in Philly. It gives emulated pinball machines the tactile and haptic feedback required for a proper PinMAME setup. Inside [Brendan]’s box are two monitors, one for the backglass and one for the playfield, and a small computer to run the PinMAME software.
Also in the box are a few transducers usually used to turn any flat solid surface into a speaker. [Brendan] sent the audio output from the pinball emulation to a set of speakers and the ‘mechanical sounds’ audio to the transducers mounted to the chassis. The difference between haptic feedback and no haptic feedback is amazing, and something every PinMAME setup desperately needs.
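One straightforward way to wire that up in software is to treat the sound card as a four-channel device and send the two mixes to different channel pairs. A rough sketch (the channel assignments are our assumption, not necessarily [Brendan]’s routing):

```python
import numpy as np

def split_outputs(game_audio, mech_audio):
    """Interleave two mono streams into 4-channel output frames.

    Channels 0/1 (front left/right) carry the game's normal sound
    effects for the speakers; channels 2/3 carry the solenoid and
    chime samples for the chassis-mounted transducers, so the
    cabinet thumps on cue without muddying the speaker mix."""
    n = min(len(game_audio), len(mech_audio))
    frames = np.zeros((n, 4), dtype=np.float32)
    frames[:, 0] = frames[:, 1] = game_audio[:n]   # speakers
    frames[:, 2] = frames[:, 3] = mech_audio[:n]   # transducers
    return frames
```

Hand the resulting frame buffer to any multichannel audio API and the transducer amp only ever sees the mechanical sounds.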
Unfortunately, [Brendan] says he lives a decade in the past and doesn’t do the whole interwebs and email thing. He tells us he’ll send in a build log in a week or so, and we’ll put that up when it comes in.
Continue reading “How to make PinMAME awesome”
Hackaday alum [Caleb Kraft] tipped us off about a nice hack he got to see at the Open Hardware Summit this year. It is a flexible haptic strip made from an LED strip.
Cheap flexible printed circuit boards aren’t easy to find, so [Jacob] basically swapped all the RGB LEDs on his strip for shaftless vibration motors. The LEDs were addressed by WS2801 LED drivers, so the hack also involved shorting the current feedback resistors. As a result, each motor draws as much current as the driver can supply, and [Jacob] can drive every motor individually. Luckily for him, there was already an Arduino library called fastSPI to drive the strip, so he managed to make a nice haptic device in no time. In case you were wondering, the maximum number of motors you can drive is 32.
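Since each WS2801 simply clocks in three 8-bit intensity bytes over SPI and latches them when the clock goes idle, driving the motors is just a matter of building the right byte stream. A quick sketch of what that frame might look like (our illustration, not [Jacob]’s code):

```python
def ws2801_frame(levels):
    """Build the SPI byte stream for a daisy chain of WS2801 drivers.

    Each chip clocks in three 8-bit values (normally R, G, B duty
    cycles) and passes the rest down the chain; here every byte sets
    a vibration motor's intensity instead of an LED's brightness.

    levels: 0-255 intensity values, three per chip in the chain."""
    if len(levels) % 3:
        raise ValueError("each WS2801 takes exactly 3 channel bytes")
    return bytes(levels)  # shift out over SPI, then idle to latch
```

So `ws2801_frame([255, 0, 128])` yields the three bytes for one chip: one motor at full power, one off, one at half.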
Our own [Eric Evenchick] also saw a lot of great project demos during his time at the OHS.
[via EE Times]
This little device is about the size of a webcam, and it perches on top of your computer monitor in much the same way. It’s Disney’s solution to haptic feedback for gestural input. That is to say, wave your hands in the air to control a computer, and this will give you some sense of actually touching the virtual objects.
The thing shoots toroids of air at the user. We thought the best example of how this is used is the soccer ball demo in the video. A game is being played where virtual soccer balls are launched toward the user. The rig shoots out a puff of air to go along with each ball. When you get your hand in the right place you’ll feel the vortex of air and know you’ve made contact with the virtual object.
On the hardware side this is just begging to be recreated in your basement. What we have here is a 3D printed enclosure that has six sides. Five of them have speaker elements that create pressure waves when given an electrical signal. When coordinated they cause a perfect ring vortex (think cigar smoke ring) to shoot out the flexible nozzle which can be aimed thanks to a gimbal setup. Of course the element that makes it interactive is a 3D camera, which could be a Kinect or Leap Motion when built in the home workshop.
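Aiming the nozzle is a small bit of trigonometry: take the hand position reported by the depth camera and convert it to pan and tilt angles for the gimbal. A sketch, assuming a camera-style coordinate frame with the nozzle at the origin (the frame convention is our assumption):

```python
import math

def aim_gimbal(target):
    """Convert a hand position from the depth camera into gimbal angles.

    target: (x, y, z) in meters with the nozzle at the origin, x to
    the right, y up, z straight out toward the user. Returns
    (pan, tilt) in degrees, (0, 0) meaning dead ahead."""
    x, y, z = target
    pan = math.degrees(math.atan2(x, z))                   # left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    return pan, tilt
```

Drive two hobby servos on the gimbal with those angles, time the speaker pulse for the ring’s flight time to the hand, and you have the core of the demo.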
Continue reading “Disney prototype adds haptic feedback to gestural interface”
Many people with hearing impairments have assistive devices at home that flash a light whenever a fire truck goes by, an alarm bell goes off, or the doorbell rings. With the exception of a hearing dog, these devices are useless outside the home, and this is where [Halley]’s Flutter dress comes into play. Flutter has microphones and microcontrollers sewn into the dress to listen to the surrounding environment and uses small vibration motors to wave small cloth leaflets whenever a loud sound is detected.
In the writeup for Flutter (PDF), [Halley] tells us she used a quartet of microcontrollers to detect the ambient acoustic environment. Each microcontroller passes the signal from its microphone into a buffer and performs an FFT on the sound data. From this, the loudness and frequency of a noise – as well as its direction, from a time-of-flight calculation – can be determined. Once that is complete, each microcontroller actuates a small vibration motor in the dress’s leaves according to the loudness and direction of the sound.
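That processing chain is easy to sketch with a pair of mono recordings: cross-correlate two microphone channels to get the time difference of arrival (and from it a bearing), and use an FFT to pull out loudness and dominant frequency. The functions below are our own hedged illustration of the technique, not [Halley]’s firmware:

```python
import numpy as np

def sound_direction(mic_a, mic_b, fs, mic_spacing, c=343.0):
    """Bearing estimate from the time difference of arrival between
    two mics: cross-correlate the channels, find the lag of the peak,
    then convert lag -> seconds -> path difference -> angle.
    0 degrees means the sound arrived from straight ahead of the pair."""
    corr = np.correlate(mic_a, mic_b, mode="full")
    lag = np.argmax(corr) - (len(mic_b) - 1)       # samples
    delay = lag / fs                               # seconds
    ratio = np.clip(delay * c / mic_spacing, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

def loudness_and_pitch(frame, fs):
    """RMS loudness and dominant frequency of one audio frame."""
    loudness = float(np.sqrt(np.mean(frame ** 2)))
    spectrum = np.abs(np.fft.rfft(frame))
    peak_hz = np.argmax(spectrum) * fs / len(frame)
    return loudness, peak_hz
```

Threshold the loudness, pick the leaf motors on the side the bearing points to, and you have the essence of the dress.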
As with all assistive technologies for the hearing impaired, there is the point of view within deaf culture that such inventions treat deafness as a disability to be fixed. [Halley] designed the Flutter dress with the input of a few family members who have hearing impairments and got some positive feedback from members of the community. Good job, and we can see why it won Best in Show at the 2012 International Symposium on Wearable Computers’ Design Exhibition.
[Diego] wrote in to let us know about the haptic feedback arm project he’s hard at work on. He calls it the Vimphin, a name built from the initial letters of Virtual Manipulator Physical Interface. Instead of a claw, the robot arm has a hand grip that lets you easily move it around. That is, unless the virtual model of the arm encounters a dense substance – then it’s going to be more difficult to move.
The test arm seen above includes several high-quality robotic servo motors. You probably know that servo motors have feedback circuits that let them sense their position, and this is what is used to detect when a user moves the arm. This movement is tracked in the virtual 3D environment seen on the screen. In this case, the base of the robot is sitting in a pool of water. When the end of the virtual arm is in open air it’s pretty easy to move. When it dips below the water line the motors are used to increase resistance, simulating movement through a denser substance.
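The resistance trick boils down to velocity-dependent damping: light when the virtual end effector is in air, heavy once it’s below the waterline. A toy sketch of that idea (the drag coefficients are made-up numbers, not [Diego]’s tuning):

```python
def resistance_torque(effector_z, waterline_z, velocity,
                      air_drag=0.2, water_drag=3.0):
    """Damping torque commanded to the servo, opposing the user's
    motion: light in air, heavy once the virtual end effector dips
    below the waterline, so pushing the real arm 'through water'
    feels thick. velocity is the joint speed the servo reports."""
    drag = water_drag if effector_z < waterline_z else air_drag
    return -drag * velocity
```

Run this every control tick with the position and speed read back from the servos, and the arm stiffens the moment the virtual hand breaks the surface.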
This sounds like a great piece of hardware to have around when the OASIS is finally developed.
Continue reading “Robot arm provides haptic feedback from the virtual world”