The people at Two Bit Circus are at it again, this time with a futuristic racing simulator where the user controls the experience. It was developed by [Brent Bushnell] and [Eric Gradman] along with a handful of engineers and designers in Los Angeles, California. The immersive gaming chair is built around an actual racing seat, and foot pedals were added to give the driver more of the feeling of being in a real race. Cooling fans were placed on top for haptic feedback, and a Microsoft Kinect was integrated into the system to detect hand gestures that control what appears on the various screens.
The team completed the project within thirty days as part of a challenge from Best Buy, which wanted to see if they could create the future of viewing experiences. Problems surfaced throughout that time frame, creating obstacles around video cards, monitors, and shipping dates. They got it done, and are now looking toward integrating their work into restaurants like Dave & Buster’s and other venues like arcades and bars (at least that’s the rumor going around town). The five-part mini-series produced around this device can be seen after the break:
Continue reading “Custom Racing Chair with a Kinect and Haptic Feedback”
Ever feel like someone is watching you? Like, somewhere in the back of your mind, you can feel the peering eyes of something glancing at you? Tapping into that paranoia is this Computer Science graduate project, created during a “Tangible Interactive Computing” class at the University of Maryland by two bright young students named [Josh
As you’ve probably gathered from the title, this project uses a Microsoft Kinect to track the movement of nearby people. The output is then translated into control signals for the mounted eyeballs, producing a creepy vibe radiating from the feline robot poster.
Continue reading “Creepy Cat Eyes with a Microsoft Kinect”
On June 26th, 2014, Clearpath Robotics opened the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming a ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot as it tore through the material with flailing arms.
This was accomplished by having [Jake], the robot, use a Kinect 2.0 that fed skeleton tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with an X, Y, Z component, allowing [Jake] to locate himself within a 3D space. That data was then collected and published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
Thus, by simply stepping in front of the Kinect 2.0 and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave its arms at oncoming ribbons. Once finished, [Jake] would leave the scene, journeying back into the newly created robot lair, leaving pieces of nylon and polyester everywhere.
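The core of a teleoperation rig like this is mapping the operator's skeleton onto the robot's reach. A minimal sketch of that idea, assuming hypothetical joint inputs and an arbitrary arm-reach scale (this is not Clearpath's actual code), might look like:

```python
# Hypothetical sketch: map a Kinect skeleton hand position onto a robot
# arm target. The shoulder-to-hand vector is normalized and scaled to
# the robot's reach; joint naming and the reach value are assumptions.

def hand_to_arm_target(shoulder_xyz, hand_xyz, reach=0.8):
    """Scale the operator's shoulder-to-hand vector onto the arm's reach."""
    dx = [h - s for h, s in zip(hand_xyz, shoulder_xyz)]
    length = max(1e-9, sum(c * c for c in dx) ** 0.5)  # avoid divide-by-zero
    return tuple(reach * c / length for c in dx)
```

In a real setup this target would then be published to the PR2's arm controller over ROS; here it is just a pure coordinate transform.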
An earlier (un-smoothed) version of the full system can be seen after the break:
Continue reading “Cutting Ribbons with Robots and an Oculus Rift”
The Kinect has long been able to create realistic 3D models of real, physical spaces. Combining these Kinect-mapped spaces with an Oculus Rift, however, is something entirely new.
[Thomas] and his fellow compatriots on the Kintinuous project are modeling an office space with the old Xbox 360 Kinect’s RGB+D sensors, then using an Oculus Rift to inhabit that space. They’re not using the internal IMU in the Oculus to position the camera in the virtual space, either: they’re using live depth sensing from the Kinect to feed the Rift screens.
While Kintinuous is very, very good at mapping large-scale spaces, the software itself is locked up behind copyright concerns the authors and devs don’t have control over. This doesn’t mean the techniques behind Kintinuous are locked up, however: anyone is free to read the papers (here’s one, and another, PDF of course) and re-implement Kintinuous as an open source project. That would be really cool, and we’d encourage anyone with a bit of experience with point clouds to give it a shot.
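For anyone tempted to take that shot: the first building block of any Kintinuous-style pipeline is back-projecting a depth image into a point cloud through the pinhole camera model. A minimal sketch, using intrinsics commonly quoted as defaults for the Xbox 360 Kinect depth camera (placeholders, not a real calibration):

```python
import numpy as np

# Commonly cited default intrinsics for the Xbox 360 Kinect depth camera.
# Treat these as placeholders; a real pipeline would calibrate per device.
FX, FY, CX, CY = 594.2, 591.0, 339.5, 242.7

def depth_to_points(depth):
    """depth: (H, W) array of depths in meters; returns an (N, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - CX) * depth / FX      # pinhole back-projection
    y = (v - CY) * depth / FY
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]      # drop invalid (zero-depth) pixels
```

Registering successive clouds against each other (the actual hard part of Kintinuous) is left to the papers linked above.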
Continue reading “Virtual Physical Reality With Kintinuous And An Oculus Rift”
Here’s a really clever use for the Oculus Rift — Ecstatic Computation, a virtual reality spirit journey.
[Michael Allison] began his university career as an artist and musician… and somehow down the line became a Technoshaman. His thesis, presented at ITP 2014, covers computational art, virtual reality, cognitive psychology, and his research into the various religious, spiritual, and scientific methods that try to explain the relationship between our bodies, our minds, and the universe itself.
Using virtual reality, Ecstatic Computation is a ritual that explores the merging of consciousness and quantum energy in the physio-chemical registration of state within the computer’s memory. The moment when human and computer become one; the moment when thought becomes bit and electrons become ideas.
Sound crazy? Maybe — but check out the video demonstrations after the break. To create this experience he’s using an Oculus Rift, a Microsoft Kinect, a fan, a small keyboard and of course a computer to render it all. During the participant’s journey, [Michael] leads them in flight, passing through a quantum tunnel, merging with quantum energy inside of state registration within the computer’s memory and finally ending by falling into infinity.
All the graphics and effects are generated on the fly with GLSL, using a robust graphics renderer called Smolder that he wrote himself on top of Cinder.
Continue reading “Ecstatic Computation: Exploring Technoshamanism with Virtual Reality”
The old gen 1 Kinect has seen a fair bit of use in the field of making 3D scans out of real world scenes. Now that Xbox 360 Kinects are winding up at yard sales and your local Goodwill, you might even have a chance to pick one up for pocket change. Until now, though, scanning objects in 3D has only been practical in a studio or workshop setting; for a mobile, portable scanner, you’d need to lug around a computer and a power supply, and that’s not really something you can fit in a backpack.
Now, finally, that may be changing. [xxorde] can now get depth data from a Kinect sensor with a Raspberry Pi. And with just about every other ARM board out there as well. It’s a kernel driver that’s small, fast, and does just one thing: turns the Kinect into a webcam that displays depth data.
Of course, a portabilized Kinect 3D scanner has been done before, but that was with an absurdly expensive Gumstix board. With a Raspi or BeagleBone Black, this driver is the beginnings of a very cheap 3D scanner that would be much more useful than the current commercial or DIY desktop scanners.
We went to “the dark room” at Maker Faire once more for an interview with [Sarah] of Robot-Army. She and [Mark], who handles software development for the project, were showing off 30 delta robots that know how to dance. Specifically, they dance in unison, mirroring the movements of another faire-goer. A Kinect sensor monitors those movements and translates them into matching motions from the deltabots.
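Raw Kinect joint positions jitter, so driving 30 robots straight from the tracker would make the whole army twitch. A common fix is to smooth each tracked position before broadcasting one pose to every bot; a minimal sketch using an exponential moving average (the alpha value is an arbitrary illustration, not Robot-Army's actual filter):

```python
# Sketch: smooth noisy Kinect joint positions with an exponential moving
# average before fanning the pose out to the delta robots. Alpha trades
# responsiveness (high) against smoothness (low); 0.3 is illustrative.

class JointSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None          # last smoothed (x, y, z)

    def update(self, xyz):
        if self.state is None:
            self.state = list(xyz)
        else:
            self.state = [self.alpha * new + (1 - self.alpha) * old
                          for new, old in zip(xyz, self.state)]
        return tuple(self.state)
```

One smoother per tracked joint, fed once per Kinect frame, is enough to turn jittery skeleton data into fluid dancing.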
You should remember seeing this project back in November. Now that the standards for this model have been worked out, it was just a matter of sinking about three weeks into assembling the army. We’re happy to see that the Kickstarter reached 250% of its goal at the beginning of March, and with that there are even bigger plans. [Sarah] says the goal remains to fill a room with the robots, and we may even see a much larger version some day.
The interview is a bit short since the Robot-Army booth was right next to Arc Attack (hence the noise-cancelling headphones), and we had to try to get in and out between their eardrum-shattering interruptions. But you can see a ton more about the project in this huge build log post over on Hackaday.io. Also check out the Robot-Army webpage. There’s a nice illustration of their adventures at MFBA, and the foam Jolly Wrencher made it into the piece!