Consider the complexity of the appendages sitting at the end of your arms. Together, the human hands contain over a quarter of the entire complement of bones in the body, use dozens of muscles both in the hand itself and extending up the forearm, and are capable of almost infinite variety in the movements they can create. They are exquisite machines.
And yet when it comes to virtual reality, most simulations treat the hands like inert blobs. That may be partly due to their complexity; doing motion capture from so many joints can be computationally challenging. But this pressure-sensitive hand motion capture rig aims to change that. The product of an undergraduate project by [Leslie], [Hunter], and [Matthew], the idea was to provide an economical and effective way to capture gestures for virtual reality simulators, which generally focus on capturing large motions from the whole body.
The sensor consists of a sandwich of polyurethane foam with strain gauge sensors embedded within. The user slips his or her hand into the foam and rests the fingers on the sensors. A Teensy and twenty lines of code translate finger motions within the sandwich into five axes of joystick movement, which is then sent to Unreal Engine, where the finger motions are translated to a 3D model of a hand to play a VR game of “Rock, Paper, Scissors.”
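The actual firmware is a short sketch running on the Teensy, which we haven't seen; as a rough illustration of what that mapping might look like, here is a hypothetical Python version of the idea: clamp each strain gauge's raw ADC reading, center it on the finger's resting value, and scale it into a signed joystick axis, one axis per finger. All names and the specific scaling are assumptions, not the team's code.

```python
# Hypothetical sketch of the strain-gauge-to-joystick mapping described
# above. The real rig runs ~20 lines on a Teensy; this just shows the idea.

ADC_MAX = 1023                        # 10-bit ADC reading, 0..1023
AXIS_MIN, AXIS_MAX = -32767, 32767    # signed joystick axis range

def adc_to_axis(reading, rest=512):
    """Map a raw ADC reading to a joystick axis, centered on the
    finger's resting value so no pressure means no deflection."""
    reading = max(0, min(ADC_MAX, reading))       # clamp out-of-range noise
    span = max(rest, ADC_MAX - rest)              # widest half-range from center
    return int((reading - rest) / span * AXIS_MAX)

def read_hand(adc_readings):
    """Five strain gauges -> five joystick axes, one per finger."""
    return [adc_to_axis(r) for r in adc_readings]
```

With a mapping like this, a hand at rest reports all-zero axes, and pressing or lifting a finger swings its axis toward one extreme, which is all a game engine needs to animate a virtual hand.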
[Leslie] and her colleagues have a way to go on this; testers complained that the flat hand posture was unnatural, and that the foam heated things up quickly. Maybe something more along the lines of these gesture-capturing gloves would work?
What makes you afraid? Not like jump-scares in movies or the rush of a roller-coaster, but what are your legitimate fears that qualify as phobias? Spiders? Clowns? Blood? Flying? Researchers at The University of Texas at Austin are experimenting with exposure therapy in virtual reality to help people manage their fears. For some phobias, like arachnophobia, the fear of spiders, this seems like a perfect fit. If you are certain that you are safely in a spider-free laboratory wearing a VR headset, and you see a giant spider crawling across your field of vision, the fear may be more manageable than being asked to put your hand into a populated spider tank.
After the experimental therapy, participants were asked to take the spider tank challenge. Subjects who were not shown VR spiders were less enthusiastic about keeping their hands in the tank. This is not definitive proof, but it is a promising start.
High-end VR equipment and homemade rigs are in the budget for many gamers and hackers, and our archives are an indication of how much the cutting-edge crowd loves immersive VR. We have been hacking 360° recording for nearly a decade, long before 360 cameras took their niche in the consumer market. Maybe when this concept is proven out a bit more, implementations will start appearing in our tip line from hackers who helped their friends get over their fears.
Via IEEE Spectrum.
Photo by Wokandapix.
It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, 120 Hz refresh rate, 100 degree FOV, and the entire thing would cost under $100 (in volume), with everything, from firmware to mechanical design released under Open licenses. Needless to say, that’s easier said than done. Now it seems Leap Motion is releasing files for various components and a full-scale release might be coming sooner than we think.
Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only worked with hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: everything turned into a touchscreen display, you could draw with your fingers, and control robots with your hands. If you mount one of these sensors to your forehead, and reflect a few phone screens onto your retinas, you have the makings of a stereoscopic AR headset that tracks the movement of your hands. This is an over-simplified description, but conceptually, that’s what Project North Star is.
The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.
In first-person games, an effective way to heighten immersion is to give the player a sense of impact and force by figuratively shaking the camera. That’s a tried and true practice for FPS games played on a monitor, but to [Zulubo]’s knowledge, no one has implemented traditional screen shake in a VR title because it would be a sure way to trigger motion sickness. Unsatisfied with that limitation, some clever experimentation led [Zulubo] to a method of doing screen shake in VR that doesn’t cause any of the usual problems.
Screen shake doesn’t translate well to VR because the traditional method is to shake the player’s entire view. This works fine when viewed on a monitor, but in VR the brain interprets the visual cue as evidence that one’s head and eyeballs are physically shaking while the vestibular system is reporting nothing of the sort. This kind of sensory mismatch leads to motion sickness in most people.
The key to getting the essence of a screen shake without any of the motion sickness baggage turned out to be a mix of two things. First, the shake is restricted to peripheral vision only. Second, it is restricted to an “in and out” motion, with no tilting or twisting. The result is a conveyance of concussion and impact that doesn’t rely on shaking the player’s view, at least not in a way that leads to motion sickness. It’s the product of some clever experimentation to solve a problem, and freely downloadable for use by anyone who may be interested.
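The two constraints above can be captured in a few lines of math. Below is an illustrative Python sketch, not [Zulubo]'s actual implementation: displacement is zero inside the central field of view, fades in toward the periphery, and is purely radial ("in and out"), never a tilt or twist. The coordinate convention and the `inner` radius are assumptions for the example.

```python
import math

def shake_offset(x, y, amplitude, inner=0.5):
    """Shake displacement for a screen point (x, y) in normalized
    coordinates centered on the viewer (-1..1 on each axis).
    `inner` is the radius of the untouched central region."""
    r = math.hypot(x, y)
    if r <= inner:
        return (0.0, 0.0)              # central vision stays rock-steady
    # Fade the effect in from the edge of the central region outward.
    weight = min((r - inner) / (1.0 - inner), 1.0)
    scale = amplitude * weight / r     # push the point along its own radius
    return (x * scale, y * scale)
```

Because each point only moves along its own radius from the view center, the brain reads the motion as the world pulsing rather than the head rotating, which is what sidesteps the sensory mismatch described above.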
Speaking of fooling one’s senses in VR environments, here is a fascinating method of simulating zero gravity: waterproof the VR headset and go underwater.
Virtual Reality (VR) and actual reality often don’t mix: watch someone play a VR game without seeing what they see and you see a lot of pointless-looking flailing around. [Nerdaxic] may have found a balance that works in this flight sim setup that mixes VR and AR, though. He did this by combining the virtual cockpit controls of his flight simulator with real buttons, knobs, and dials. He uses an HTC Vive headset and a beefy PC to create the virtual side, which is mirrored with a real-world version. So, the virtual yoke is matched with a real one. The same is true of all of the controls, thanks to a home-made control panel that features all of the physical controls of a Cessna 172 Skyhawk.
[Nerdaxic] has released the plans for the project, including his 3D printable knobs for throttle and fuel/air mixture and the design for the wooden panel and assembly that holds all of the controls in the same place as they are in the real thing. He even put a fan in the system to produce a gentle breeze to enhance the feel of sticking your head out of the window — just don’t try that on a real aircraft.
Continue reading “Home Built Flight Sim Combines Virtual and Actual Reality”
Someday Elon Musk might manage to pack enough of us lowly serfs into one of his super rockets that we can actually afford a ticket to space, but until then our options for experiencing weightlessness are pretty limited. Even if you’ll settle for a ride on one of the so-called “Vomit Comet” reduced-gravity planes, you’ll have to surrender a decent chunk of change, and as the name implies, potentially your lunch as well. Is there no recourse for the hacker that wants to get a taste of the astronaut experience without a NASA-sized budget?
Well, if you’re willing to get wet, [spiritplumber] might have the answer for you. Using a few 3D printed components he’s designed, it’s possible to use Google Cardboard compatible virtual reality software from the comfort of your own pool. With Cardboard providing the visuals and the water keeping you buoyant, the end result is something not entirely unlike weightlessly flying around virtual environments.
To construct his underwater VR headset, [spiritplumber] uses a number of off-the-shelf products. The main “Cardboard” headset itself is the common plastic style that you can probably find in the clearance section of whatever Big Box retailer is convenient for you, and the waterproof bag that holds the phone can be obtained cheaply online. You’ll also need a pair of swim goggles to keep water from rudely interrupting your wide-eyed wonderment. As for the custom printed parts, a frame keeps the waterproof bag from pressing against the screen while submerged, and a large spacer is required to get the phone at the appropriate distance from the operator’s eyes.
To put his creation to the test, [spiritplumber] loads up a VR rendition of NASA’s Neutral Buoyancy Laboratory, where astronauts experience a near-weightless environment underwater. All that’s left to complete the experience is a DIY scuba regulator so you can stay submerged. Though at that point we wouldn’t be surprised if a passerby confuses your DIY space simulator for an elaborate suicide attempt.
Continue reading “Underwater VR Offers Zero Gravity on a Budget”
It turns out that there were a few challenges to work around and a few new problems to solve, not least of which was mapping VR controllers to control an N64 game in a sensible way. One thing that wasn’t avoidable is that the N64’s rendered world may now pop in 3D, but it still springs forth from a rectangular stage. The N64, after all, is still only rendering its world within a TV-screen-sized viewport; anything outside that rectangular window doesn’t really exist, and there’s no way around that as long as an emulated N64 is running the show. Still, the result is impressive, and a video demo is embedded below where you can see the effect for yourself.
Continue reading “N64 Emulated in VR Makes Hyrule go 3D”