Creepy Cat Eyes with a Microsoft Kinect


Ever feel like someone is watching you? Like, somewhere in the back of your mind, you can feel the peering eyes of something glancing at you? Tapping into that paranoia is this Computer Science graduate project, created during a “Tangible Interactive Computing” class at the University of Maryland by two bright young students, [Josh] and [Richard], with the help of the HCIL hackerspace.

Their Professor [Dr. Jon Froehlich] wanted the students to ‘seamlessly couple the dual worlds of bits and atoms’ and create something that would ‘explore the materiality of interactive computing.’ And this relatively simple idea does just that, guaranteeing some good reactions. 

As you’ve probably gathered from the title, this project uses a Microsoft Kinect to track the movement of nearby people. The tracking output is then translated into control signals for the mounted eyeballs, producing a creepy vibe that radiates from the feline robot poster.
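The core of the idea is simple: find where the nearest person sits in the depth image and steer the eyes toward them. Here’s a minimal sketch of that mapping using libfreenect’s Python bindings; the depth thresholds and servo angle range are our assumptions, not values from [Josh] and [Richard]’s build:

```python
# Hypothetical sketch: map the nearest person's horizontal position in a
# Kinect depth frame to an eye servo angle. Assumes libfreenect's Python
# bindings and a Kinect v1 plugged in.
import freenect
import numpy as np

def eye_angle_deg(min_mm=500, max_mm=3000):
    depth, _ = freenect.sync_get_depth(format=freenect.DEPTH_REGISTERED)  # mm
    mask = (depth > min_mm) & (depth < max_mm)     # plausible "person" range
    if not mask.any():
        return 90.0                                # nobody there: look ahead
    cols = np.where(mask.any(axis=0))[0]           # columns containing a target
    center = cols.mean() / depth.shape[1]          # 0.0 (left) .. 1.0 (right)
    return 45.0 + center * 90.0                    # map onto a 45..135 degree sweep

print(eye_angle_deg())
```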

[Read more...]

Virtual Physical Reality With Kintinuous And An Oculus Rift


The Kinect has long been able to create realistic 3D models of real, physical spaces. Combining these Kinect-mapped spaces with an Oculus Rift is something entirely new.

[Thomas] and his collaborators on the Kintinuous project are modeling an office space with the old Xbox 360 Kinect’s RGB+D sensors, then using an Oculus Rift to inhabit that space. They’re not using the Rift’s internal IMU to position the camera in the virtual space, either: they’re feeding the Rift’s screens with live depth sensing from the Kinect.

While Kintinuous is very, very good at mapping large-scale spaces, the software itself is locked up behind copyright concerns the authors and devs don’t have control over. This doesn’t mean the techniques behind Kintinuous are locked up, however: anyone is free to read the papers (here’s one, and another, PDFs of course) and re-implement Kintinuous as an open source project. That would be really cool, and we’d encourage anyone with a bit of experience with point clouds to give it a shot.
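The first building block of any such re-implementation is back-projecting a raw depth frame into a metric point cloud. Here’s a minimal sketch using libfreenect’s Python bindings; the focal length and principal point are nominal, uncalibrated Kinect v1 values, so treat them as assumptions:

```python
# Back-project a Kinect depth frame into a 3D point cloud.
# FX/FY/CX/CY are nominal (uncalibrated) Kinect v1 depth intrinsics.
import freenect
import numpy as np

FX = FY = 594.2          # focal length in pixels (approximate)
CX, CY = 339.5, 242.7    # principal point (approximate)

def depth_to_point_cloud():
    depth, _ = freenect.sync_get_depth(format=freenect.DEPTH_MM)  # HxW, mm
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth.astype(np.float32) / 1000.0            # meters
    valid = z > 0                                    # 0 means "no reading"
    x = (u - CX) * z / FX                            # pinhole back-projection
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)  # Nx3 array

points = depth_to_point_cloud()
print(points.shape)
```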

Video below.

[Read more...]

The Race Is On To Build A Raspi Kinect 3D Scanner


The old gen 1 Kinect has seen a fair bit of use in the field of making 3D scans out of real-world scenes. Now that Xbox 360 Kinects are winding up at yard sales and your local Goodwill, you might even be able to pick one up for pocket change. Until now, though, scanning objects in 3D has only been practical in a studio or workshop setting; a mobile, portable scanner would mean lugging around a computer and a power supply, and that’s not something you can fit in a backpack.

Now, finally, that may be changing. [xxorde] can now get depth data from a Kinect sensor with a Raspberry Pi, and with just about every other ARM board out there as well. It’s a kernel driver that’s small, fast, and does just one thing: it turns the Kinect into a webcam that outputs depth data.
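Because the driver presents the Kinect as a standard webcam, grabbing depth frames should be no harder than opening a V4L2 device. A minimal sketch with OpenCV, assuming the driver registers the sensor as /dev/video0 (the device number and pixel format will depend on your setup):

```python
# Read depth frames from a Kinect exposed as a V4L2 webcam device.
# Assumes the kernel driver registered the sensor as /dev/video0.
import cv2

cap = cv2.VideoCapture(0)                    # /dev/video0
if not cap.isOpened():
    raise RuntimeError("could not open the Kinect depth device")

while True:
    ok, frame = cap.read()                   # depth encoded as image pixels
    if not ok:
        break
    cv2.imshow("kinect depth", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):    # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```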

Of course, a portabilized Kinect 3D scanner has been done before, but that was with an absurdly expensive Gumstix board. With a Raspi or BeagleBone Black, this driver is the beginning of a very cheap 3D scanner that would be much more useful than the current commercial or DIY desktop scanners.

Virtual Reality Gets Real with 3 Kinect Cameras


No, that isn’t a scene from a horror movie up there, it’s [Oliver Kreylos'] avatar in a 3D office environment. If he looks a bit strange, it’s because he’s wearing an Oculus Rift, and his image is being stitched together from 3 Microsoft Kinect cameras.

[Oliver] has created a 3D environment which is incredibly realistic, at least to the wearer. He believes the secret is in the low latency of the entire system. When coupled with a good 3D environment, like the office shown above, the mind is tricked into believing it is really in the room. [Oliver] mentions that he finds himself subconsciously moving to avoid bumping into a table leg that he knows isn’t there. In [Oliver’s] words, “It circumnavigates the uncanny valley.”

Instead of pulling skeleton data from the 3 Kinect cameras, [Oliver] is using video and depth data. He’s stitching and processing this data on an i7 Linux box with an NVIDIA GeForce GTX 770 video card. Powerful hardware for sure, but not the cutting-edge monster rig one might expect. [Oliver] also documented his software stack. He’s using the Vrui VR Toolkit, the Kinect 3D Video Capture Project, and the Collaboration Infrastructure.
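The key step in stitching multiple Kinects is extrinsic calibration: each camera’s point cloud has to be transformed into a shared world frame before the views can be fused. A hedged sketch of just that step (the 4x4 pose matrices here are identity placeholders; a real setup would get them from a calibration procedure):

```python
# Merge point clouds from several calibrated Kinects into one world frame.
# The camera poses below are hypothetical placeholders, not real calibration.
import numpy as np

def transform(points, pose):
    """Apply a 4x4 rigid-body pose to an Nx3 point cloud."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])  # Nx4
    return (homogeneous @ pose.T)[:, :3]

# One pose per camera, mapping camera coordinates to world coordinates.
poses = [np.eye(4),          # camera 0 defines the world frame
         np.eye(4),          # placeholder: calibrated pose for camera 1
         np.eye(4)]          # placeholder: calibrated pose for camera 2

def merge(clouds):
    """clouds: list of Nx3 arrays, one per Kinect, in camera coordinates."""
    return np.vstack([transform(c, p) for c, p in zip(clouds, poses)])
```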

We can’t wait to see what [Oliver] does when he gets his hands on the Kinect One (and some good Linux drivers for it).

[Read more...]

Autonomous Lighting with Intelligence


Getting into home automation usually starts with lighting: hacking your lights to turn on automatically when motion is detected, adding timer controls, or even tying everything into an app on your smartphone. [Ken] took things to a completely different level by giving his lighting intelligence.

The system is called ‘Myra’, and it works by detecting what you’re doing in the room; based on that activity, robotic lights adjust themselves to light it optimally. For example, if you’re walking through the room, the system attempts to illuminate your path as you walk. Other activities are detected as well, like reading a book, watching TV, or just standing still.

At the heart of the ‘Myra’ system is an RGB-D sensor (a Microsoft Kinect or ASUS Xtion). A PC processes the sensor’s view of the room to determine the current ‘activity’. Wireless robotic lights are strategically placed around the room, each with a two-servo gimbal and a standalone Arduino. The PC sends each light a command containing the angles for the two axes and the intensity of the light. The light receives this command wirelessly via a 315 MHz receiver, and its Arduino then ‘aims’ the beam accordingly.
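[Ken] hasn’t published the wire format, but the PC side of such a command could look something like this minimal sketch: pack a light ID, two servo angles, and an intensity into a few bytes, then hand them to a serial-attached 315 MHz transmitter. The packet layout, checksum, and serial port here are all our invention, not [Ken]’s:

```python
# Hypothetical PC-side command packing for a Myra-style robotic light.
# The packet layout and serial-attached 315 MHz transmitter are assumptions.
import struct
import serial  # pyserial

def make_command(light_id, pan_deg, tilt_deg, intensity):
    """Pack a light command: 1-byte ID, two 1-byte angles, 1-byte intensity."""
    assert 0 <= pan_deg <= 180 and 0 <= tilt_deg <= 180
    assert 0 <= intensity <= 255
    payload = struct.pack("BBBB", light_id, pan_deg, tilt_deg, intensity)
    checksum = sum(payload) & 0xFF                 # simple additive checksum
    return b"\xAA" + payload + bytes([checksum])   # 0xAA = start-of-frame

tx = serial.Serial("/dev/ttyUSB0", 9600)           # transmitter's serial port
tx.write(make_command(light_id=2, pan_deg=120, tilt_deg=45, intensity=200))
```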

This isn’t the first time we’ve seen [Ken’s] work; a couple of years ago we saw his one-of-a-kind ‘real life’ weather display. The ‘Myra’ system is still a work in progress, so we can’t wait to see how it all ends up. Be sure to check out the video after the break for a demo of the system.

[Read more...]

Holograms With The New Kinect


The Xbox One is out, along with a new Kinect sensor, and this time around Microsoft didn’t waste any time making this 3D vision sensor available for Windows. [programming4fun] got his hands on the new Kinect v2 sensor and started work on a capture system to import anything into a virtual environment.

We’ve seen [programming4fun]’s work before with an extremely odd and original build that turns any display into a 3D display with the help of a Kinect v1 sensor. This time around, [programming] isn’t just using a Kinect to display a 3D object; he’s also using a Kinect to capture 3D data.

[programming] captured himself playing a few chords on a guitar with the new Kinect v2 sensor. The capture was saved to a custom file format that can be played back in the Unity engine. With the help of a Kinect v1, [programming4fun] can pan and tilt around this virtual model simply by moving his head.
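The details of that file format aren’t public, but the general idea behind a replayable capture is easy to sketch: append timestamped depth frames to a binary file that a player can step through later. The layout below is our illustration of the concept, not [programming4fun]’s actual format:

```python
# Hypothetical recorder for timestamped depth frames, illustrating the idea
# of a replayable capture format. This is NOT [programming4fun]'s format.
import struct
import time
import numpy as np

def record_frames(path, frames):
    """frames: iterable of 2D uint16 depth arrays, written with timestamps."""
    with open(path, "wb") as f:
        for depth in frames:
            h, w = depth.shape
            # Per-frame header: timestamp (double), height and width (uint32).
            f.write(struct.pack("<dII", time.time(), h, w))
            f.write(depth.astype("<u2").tobytes())   # little-endian uint16

# Example: record ten synthetic 424x512 frames (Kinect v2 depth resolution).
synthetic = (np.random.randint(0, 4500, (424, 512), dtype=np.uint16)
             for _ in range(10))
record_frames("capture.bin", synthetic)
```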

If that’s not enough, [programming] has also included support for the Oculus Rift, turning the Unity-based virtual copy of himself into something he can interact with in a video game.

As far as we can tell, this is the first build on Hackaday using the new Kinect sensor. We asked what everyone was going to do with this new and improved hardware, and from [programming]’s demo, it seems like there’s still a lot of unexplored potential in the new Xbox One spybox.

[Read more...]

A New Way to Heat People


[Leigh Christie] is a researcher at MIT, and he’s developed an interesting solution to heating people, not buildings.

His TEDx talk, “Heating Buildings is Stupid,” demonstrates the MIT SENSEable City Laboratory’s efforts to tackle energy issues. Their research focuses on finding an alternative to the staggering waste of energy used to heat large spaces. Although TED talk articles are a rarity at Hackaday, we think this idea is both simple and useful. Also, [Leigh] is the same guy who brought us the Mondo Spider a few years ago for the Burning Man exhibition. He’s a hacker.

Anyway, what is it? The system he’s devised is so simple that it’s brilliant: a person-tracking infrared heat spotlight. Using a Microsoft Kinect, the lamp follows you around and keeps you warm rather than heating the entire space. [Leigh] has grand plans for implementing what he calls “Local Heating” in large buildings to save on energy consumption, but smaller-scale implementations could prove equally beneficial for a big garage or a workshop. How much does your workspace cost to heat during the winter? Hackerspaces seem like the perfect test environment for a cobbled-together “Local Heating” system. If anyone builds one, we want to hear about it.
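Aiming such a lamp is straightforward geometry once the Kinect reports where a person is: convert the tracked 3D point into pan and tilt angles for a two-servo gimbal. A minimal sketch, assuming the position has already been transformed into the lamp’s coordinate frame:

```python
# Convert a tracked 3D position into pan/tilt angles for a spotlight gimbal.
# Assumes the person's position is already in the lamp's coordinate frame:
# x right, y up, z forward, all in meters.
import math

def aim(x, y, z):
    """Return (pan, tilt) in degrees to point the lamp at (x, y, z)."""
    pan = math.degrees(math.atan2(x, z))                   # left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    return pan, tilt

# Example: a person 2 m ahead, 0.5 m to the right, 0.3 m below the lamp.
print(aim(0.5, -0.3, 2.0))
```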

Check out the full TEDx talk after the break.

[Read more...]
