Kyocera is vastly expanding its product lineup with the Shop Sink 3530, the perfect addition to your copiers, fax machines, and laser printers.
About a year and a half ago, over the objections of the editorial staff, we did a Top 10 hacking fails in movies and TV post. The number one fail was, “Stupid crime shows like NCIS, CSI, and Bones.” A new show on CBS just topped this list. It’s called Scorpion, and wow: dropping a Cat5 cable from an airplane during an almost-touch-and-go because something is wrong with the computers in the tower. Four million adults aged 18-49 watched this.
[Derek] found something that really looks like the Hackaday logo in a spacer of some kind. It’s been sitting on his shelf for a few months, and he’s only now sending it in. He picked it up in a pile of scrap metal, and he (and we) really have no idea what this thing is. Any guesses?
[Art] has another ‘what is this thing’. He has two of them, and he’s pretty sure it’s some sort of differential, but other than that he’s got nothing. The only real clue is that [Art] lives near a harbor on the Northern California coast. Maybe it’s from a navigation system, or a governor from a weird diesel?
So you have a Kinect sitting on a shelf somewhere. That’s fine, we completely understand that. Here’s something: freeze yourself in carbonite. Yeah, it turns out having a depth sensor is exactly what you need to make a carbonite copy of yourself.
The people at Two Bit Circus are at it again; this time with a futuristic racing simulator where the user controls the experience. It was developed by [Brent Bushnell] and [Eric Gradman] along with a handful of engineers and designers in Los Angeles, California. The immersive gaming chair utilized an actual racing seat in the design, and foot pedals were added to give the driver more of a feeling of actually being in a real race. Cooling fans were placed on top for haptic feedback, and a Microsoft Kinect was integrated into the system as well to detect hand gestures that control what appears on the various screens.
The team completed the project within thirty days during a challenge from Best Buy, who wanted to see if they could create the future of viewing experiences. Problems surfaced throughout the time frame, though, creating obstacles with the video cards, monitors, and shipping dates. They got it done and are looking toward integrating their work into restaurants like Dave & Buster’s and other venues like arcades and bars (at least that’s the rumor going around town). The five-part mini-series that was produced around this device can be seen after the break:
Ever feel like someone is watching you? Like, somewhere in the back of your mind, you can feel the peering eyes of something glancing at you? Tapping into that paranoia is this Computer Science graduate project, created during a “Tangible Interactive Computing” class at the University of Maryland by two bright young students, [Josh] and [Richard], with the help of the HCIL hackerspace.
Their Professor [Dr. Jon Froehlich] wanted the students to ‘seamlessly couple the dual worlds of bits and atoms’ and create something that would ‘explore the materiality of interactive computing.’ And this relatively simple idea does just that, guaranteeing some good reactions.
As you’ve probably gathered from the title, this project uses a Microsoft Kinect to track the movement of nearby people. The output is then translated into control signals for the mounted eyeballs, producing a creepy vibe radiating out from the feline robot poster.
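The tracking-to-eyeball mapping boils down to simple geometry: given a person’s position in the Kinect’s coordinate frame, compute the angle an eye’s servo must turn to point at them. Here’s a minimal sketch of that idea; the function name, coordinate conventions, and the `eye_offset_x` parameter are illustrative assumptions, not details from the project.

```python
import math

def gaze_angle(person_x, person_z, eye_offset_x=0.0):
    """Map a tracked person's position (metres, Kinect camera frame)
    to a horizontal servo angle in degrees for one eyeball.

    person_x     -- left/right offset from the sensor's optical axis
    person_z     -- distance out from the sensor
    eye_offset_x -- the eye's horizontal offset from the Kinect
                    (hypothetical parameter for multi-eye posters)
    """
    # atan2 gives the bearing from the eye to the person; 0 degrees
    # means the person is dead ahead.
    return math.degrees(math.atan2(person_x - eye_offset_x, person_z))
```

For example, a person standing one metre to the side at two metres’ distance yields a gaze angle of roughly 26.6 degrees, which would then be scaled into the servo’s pulse-width range.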
On June 26th, 2014, Clearpath Robotics opened up the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming the ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot tearing through the material with flailing arms.
This was accomplished by having [Jake], the robot, utilize a Kinect 2.0 that fed skeleton tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with an X, Y, Z component, allowing [Jake] to locate himself within a 3D space. Then, the data was collected and published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
Thus, by simply stepping in front of the Kinect 2.0, and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave its arms at oncoming ribbons. Once completed, [Jake] would leave the scene, journeying back into the newly created robot lair leaving pieces of nylon and polyester everywhere.
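The per-frame work in a pipeline like this is turning raw skeleton joint positions into arm goals. A minimal sketch of that step, assuming simple (x, y, z) joint tuples in metres (the joint names and any message format are illustrative, not the project’s actual ROS interface):

```python
def arm_direction(shoulder, hand):
    """Given skeleton joint positions (x, y, z) in metres from the
    Kinect, return a unit vector pointing from shoulder to hand --
    the kind of per-frame datum that could be published to the PR2
    as an arm-pointing goal.
    """
    dx, dy, dz = (h - s for h, s in zip(hand, shoulder))
    norm = (dx * dx + dy * dy + dz * dz) ** 0.5
    if norm == 0.0:
        raise ValueError("hand and shoulder coincide; no direction")
    return (dx / norm, dy / norm, dz / norm)
```

In a real ROS setup, a node would publish this vector (or joint angles derived from it) on each skeleton frame, and the PR2’s controllers would track it, which is why the un-smoothed early version mentioned below looked so jittery.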
An earlier (un-smoothed) version of the full system can be seen after the break:
[Thomas] and his fellow compatriots within the Kintinuous project are modeling an office space with the old Xbox 360 Kinect’s RGB+D sensors, then using an Oculus Rift to inhabit that space. They’re not using the internal IMU in the Oculus to position the camera in the virtual space, either: they’re using live depth sensing from the Kinect to feed the Rift screens.
While Kintinuous is very, very good at mapping large-scale spaces, the software itself is locked up behind some copyright concerns the authors and devs don’t have control over. This doesn’t mean the techniques behind Kintinuous are locked up, however: anyone is free to read the papers (here’s one, and another, PDF of course) and re-implement Kintinuous as an open source project. That’s something that would be really cool, and we’d encourage anyone with a bit of experience with point clouds to give it a shot.
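The first building block of any such re-implementation is back-projecting a Kinect depth frame into a 3D point cloud with the pinhole camera model. A minimal sketch follows; the focal lengths and principal point below are typical published figures for the Xbox 360 Kinect’s depth camera, not calibrated values, so treat them as assumptions.

```python
import numpy as np

# Approximate Xbox 360 Kinect depth-camera intrinsics (assumed, not
# calibrated): focal lengths and principal point in pixels.
FX, FY = 594.2, 591.0
CX, CY = 339.5, 242.7

def depth_to_points(depth_m):
    """Back-project a depth frame into a point cloud.

    depth_m -- (H, W) array of depths in metres; 0 marks no reading.
    Returns an (N, 3) array of XYZ points in the camera frame,
    with invalid (zero-depth) pixels dropped.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX        # pinhole model: x = (u - cx) * z / fx
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```

Kintinuous then goes much further, fusing these per-frame clouds into a single consistent large-scale model, but this is the step where every pipeline starts.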
Here’s a really clever use for the Oculus Rift — Ecstatic Computation, a virtual reality spirit journey.
[Michael Allison] began his university career as an artist and musician… and somehow down the line, became a Technoshaman. His thesis, presented at ITP 2014, is on computational art, virtual reality, cognitive psychology and his research on various religious, spiritual and scientific methods that try to explain the relationship between our bodies, minds and the universe itself.
Using virtual reality, Ecstatic Computation is a ritual that explores the merging of consciousness and quantum energy in the physio-chemical registration of state within the computer’s memory. The moment when human and computer become one; the moment when thought becomes bit and electrons become ideas.
Sound crazy? Maybe — but check out the video demonstrations after the break. To create this experience he’s using an Oculus Rift, a Microsoft Kinect, a fan, a small keyboard and of course a computer to render it all. During the participant’s journey, [Michael] leads them in flight, passing through a quantum tunnel, merging with quantum energy inside of state registration within the computer’s memory and finally ending by falling into infinity.
All the graphics and effects are generated on the fly as GLSL shaders by a robust graphics renderer called Smolder, which he wrote himself on top of Cinder.
The old gen 1 Kinect has seen a fair bit of use in the field of making 3D scans out of real world scenes. Now that Xbox 360 Kinects are winding up at yard sales and your local Goodwill, you might even have a chance to pick one up for pocket change. Until now, though, scanning objects in 3D has only been practical in a studio or workshop setting; for a mobile, portable scanner, you’d need to lug around a computer and a power supply, which isn’t really something you can fit in a backpack.
Of course, a portabilized Kinect 3D scanner has been done before, but that was with an absurdly expensive Gumstix board. With a Raspi or BeagleBone Black, this driver has the beginnings of a very cheap 3D scanner that would be much more useful than the current commercial or DIY desktop scanners.