Human Asteroids Makes You A Vector Triangle Ship

In 1979, Atari, the company [Nolan Bushnell] founded, released Asteroids to the world. Now, he’s playing the game again, only this time with the help of a laser projector and a Kinect that turns anyone sitting on a stool – in this case [Nolan] himself – into everyone’s favorite vector spaceship. It’s part of Steam Carnival, a project by [Brent Bushnell] and [Eric Gradman] that hopes to bring a modern electronic carnival to your town.

The reimagined Asteroids game uses a laser projector to display the asteroids and ship on the floor. A Kinect tracks the player sitting and rolling on a stool, while a smartphone serves as the triangular spaceship’s ‘fire’ button. The game is played in a 150 square foot arena and puts anyone in the cockpit of an asteroid-mining triangle.
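At its core, the control scheme is a coordinate transform: the Kinect reports where the player is on the floor, and the game maps that position to the ship in the projected play field. The function below is a minimal sketch of that mapping; the arena dimensions and game-field units are illustrative assumptions, not values from the actual build.

```python
# Sketch: map a Kinect-tracked floor position (meters, sensor frame)
# into a square game field. Arena size and field resolution are
# assumptions for illustration, not the Steam Carnival's values.

def to_game_coords(x_m, z_m, arena_w=3.5, arena_d=3.5, field=1000):
    """x_m is right of the sensor, z_m is forward from it."""
    # Clamp to the assumed play area, centered on the sensor's x axis.
    x = min(max(x_m, -arena_w / 2), arena_w / 2)
    z = min(max(z_m, 0.0), arena_d)
    gx = (x + arena_w / 2) / arena_w * field
    gy = z / arena_d * field
    return gx, gy

print(to_game_coords(0.0, 1.75))  # player centered, mid-arena: (500.0, 500.0)
```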

[Brent] and [Eric] hope to bring their steam carnival to LA and San Francisco next spring, but if they exceed their funding goals, they might be convinced to bring the show east of the Mississippi. We’d love to try it out, hiding behind the score like in the original Asteroids and wasting several hours.

Continue reading “Human Asteroids Makes You A Vector Triangle Ship”

Ask Hackaday: What Are We Going To Do With The New Kinect?

Yesterday, Microsoft announced its new cable box, the Xbox One. Included in the announcement is a vastly improved Kinect sensor. It won’t be available until Christmas, but the question now is: what are we going to do with it?

From the initial specs that can be found, the new version of the Kinect will output 1080p RGB video over a USB 3.0 connection to the new Xbox. The IR structured-light depth camera of the original Kinect has been replaced with a time-of-flight camera – one that sends out a pulse of light and times how long the photons take to be reflected back to the sensor. While there have been some inroads into low-cost ToF cameras – namely Intel and Creative’s Interactive Gesture Camera Development Kit and the $250 DepthSense 325 from SoftKinetic – the Kinect 2.0 will be the first time-of-flight camera you’ll be able to buy for a few hundred bucks at any Walmart.
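The time-of-flight principle itself is one line of arithmetic: depth is half the round-trip time multiplied by the speed of light. A quick back-of-the-envelope check:

```python
# Time-of-flight depth: a light pulse travels to the scene and back,
# so distance is half the round trip at the speed of light.

C = 299_792_458.0  # speed of light, m/s

def tof_depth(round_trip_s):
    """Depth in meters for a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

print(tof_depth(13.34e-9))  # ~2 m for a 13.34 ns round trip
```

The arithmetic is trivial; the electronics are not. Resolving a centimeter of depth means timing photons to roughly 66 picoseconds, which is why ToF cameras have historically been expensive.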

We’ve seen a ton of awesome Kinect hacks over the years – everything from a ‘holographic display’ that turns any TV into a 3D display to computer vision for robots and 3D scanners. A new Kinect sensor with better 3D resolution can only improve existing projects, and the time-of-flight sensor – similar in principle to the LIDAR found on Google’s driverless car – opens the door to a whole bunch of new projects.

So, readers of Hackaday, assuming someone can write a driver in a few days as happened with the Kinect 1.0, what are we going to do with it?

While we’re at it, keep in mind we made a call for Wii U controller hacks. If somebody can crack that nut, it’ll make an awesome remote for robots, FPV airplanes, and drones.

Microsoft IllumiRoom Breaks Your Video Game Out Of Its Television Prison

We see a lot of video game tech coming out of the three console giants (Microsoft, Sony, and Nintendo), and with one look we can usually predict what is going to be a flop. Case in point: the Wii U, whose sales have been less than extraordinary, and the Sony Move, motion control aimed at hardcore gamers who we believe are perfectly happy with the current evolution of their DualShock controllers. But this time around, we think Microsoft has nailed it. They’re showing off technology called IllumiRoom, which uses a projector to bring your entire gaming room into the experience.

The image above is not doctored – it’s a picture of IllumiRoom in action. A projector on the coffee table automatically calibrates to the room (using Kinect 3D data for mapping) in order to render realistic graphics on the non-flat projection surfaces. To our minds, this comes straight out of Kinect hacking projects like the Hadouken projector. With this in place, game designers are given free rein to come up with all kinds of ways to use the feature. Stick with us after the break to see what they’ve developed.
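One reason the depth map matters: projected light falls off with the square of the distance to the surface, so without compensation the far corners of a room would look dimmer than the wall behind the TV. A per-pixel gain derived from depth data can even that out. The inverse-square sketch below is purely illustrative and not Microsoft's actual calibration pipeline:

```python
# Sketch: inverse-square brightness compensation from a depth map.
# A surface twice as far receives a quarter of the light, so pixels
# are scaled by (depth / reference)^2. Illustrative only.

def gain_map(depths_m, reference_m=2.0):
    """Per-pixel scale factors so surfaces at different depths look
    equally bright, relative to a reference distance."""
    return [(d / reference_m) ** 2 for d in depths_m]

print(gain_map([2.0, 4.0]))  # [1.0, 4.0]: the farther pixel gets boosted
```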

Continue reading “Microsoft IllumiRoom Breaks Your Video Game Out Of Its Television Prison”

Putting Yourself Inside A Display

Here’s an interesting build that combines light, sound, and gesture recognition into an immersive 360-degree environment. It’s called The Bit Dome, and while the pictures and video are very cool, we’re sure it’s even more impressive in real life.

The dome is constructed from over a hundred triangles of foam insulation sheet, resulting in a structure 10 feet in diameter and 7.5 feet tall. Every corner of these panels holds an RGB LED driven by a Rainbowduino, which is in turn controlled by a computer hooked up to a Kinect.

Interacting with the dome begins by stepping inside and running the calibration routine. By having the user point their arm at different spots inside the dome, the computer can reliably tell where the user is pointing, and respond when the user cycles through the dome’s functions.
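Detecting where someone is pointing reduces to a vector between two skeleton joints: extend the shoulder-to-hand direction until it hits the dome wall. A minimal sketch of the horizontal part of that calculation, assuming (x, y, z) joint positions from a Kinect skeleton (the joint choice and coordinate frame are our assumptions, not necessarily the builders'):

```python
# Sketch: pointing direction from two Kinect skeleton joints.
# Joints are (x, y, z) in meters: x right, y up, z forward.
import math

def pointing_azimuth(shoulder, hand):
    """Horizontal angle (degrees) of the arm vector; 0 is straight
    ahead, positive is to the user's right."""
    dx = hand[0] - shoulder[0]
    dz = hand[2] - shoulder[2]
    return math.degrees(math.atan2(dx, dz))

# Arm extended 45 degrees to the right of straight ahead:
print(pointing_azimuth((0, 1.4, 0), (0.3, 1.4, 0.3)))
```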

There are a bunch of things this dome can do, such as allowing the user to conduct an audio-visual light show, run a meditation program, or even play Snake and Pac-Man. You can check out these games and more in the videos after the break.

Continue reading “Putting Yourself Inside A Display”

Robot Stroller Lets Baby Steer Without Mowing Down Other Toddlers

We’ve seen strollers and car seats that have a steering wheel for the baby to play with (like in the opening of The Simpsons). But what we hadn’t seen is a stroller that allows baby to actually steer. You might think that putting a motorized vehicle in the hands of someone so young is an accident waiting to happen, but [Xandon Frogget] thought of that and used familiar hardware to add some safety features.

The stroller seen above is a tricycle setup, making it quite easy to add motors to the two rear wheels. These are controlled by a tablet, which you can see nestled in the canopy of the stroller (look for the light reflected on the glass). The tablet interfaces with two Kinect sensors, one pointing forward and the other pointing back, which continually scan the environment for obstacles in the stroller’s path. You can see [Xandon’s] little girl holding a Wii Wheel, which connects to the tablet to handle steering. A test run at the playground is embedded after the break.
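The safety logic amounts to an emergency stop driven by the depth frames: if anything in the direction of travel is closer than some minimum clearance, cut the motors. A minimal sketch, with a made-up frame format and stop distance rather than [Xandon]'s actual values:

```python
# Sketch: obstacle check over a forward-facing depth frame.
# The 800 mm clearance is an assumed value for illustration.

STOP_DISTANCE_MM = 800

def path_is_clear(depth_frame_mm):
    """depth_frame_mm: iterable of depth readings in millimeters.
    Zeros mean 'no reading' (as with the Kinect) and are ignored."""
    return all(d == 0 or d > STOP_DISTANCE_MM for d in depth_frame_mm)

print(path_is_clear([1200, 0, 950]))    # True: nothing within 800 mm
print(path_is_clear([1200, 640, 950]))  # False: obstacle at 640 mm
```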

Continue reading “Robot Stroller Lets Baby Steer Without Mowing Down Other Toddlers”

A Portable, WiFi-enabled Kinect

The builds using a Kinect as a 3D scanner just keep getting better and better. A team of researchers from the University of Bristol has made the Kinect portable with the addition of a battery, a single-board Linux computer, and a WiFi adapter. With their Mobile Kinect project, it’s now a snap to map an environment without lugging a laptop around, or to give your next mobile robot an awesome vision system.
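Shipping depth frames over WiFi means dealing with TCP's stream semantics: the receiver sees a byte stream, not discrete frames, so each frame needs delimiting. A common approach, sketched here with an assumed wire format (not necessarily what the Bristol team used), is a length prefix:

```python
# Sketch: length-prefixed framing for raw depth frames over TCP.
# Each frame is preceded by its byte count as a 4-byte big-endian int.
import struct

def pack_frame(depth_bytes):
    """Prefix a raw frame with its length for the wire."""
    return struct.pack("!I", len(depth_bytes)) + depth_bytes

def unpack_frame(buf):
    """Recover the frame from a length-prefixed buffer."""
    (n,) = struct.unpack("!I", buf[:4])
    return buf[4:4 + n]

frame = bytes(range(8))
assert unpack_frame(pack_frame(frame)) == frame
print(len(pack_frame(frame)))  # 12: 4-byte header + 8-byte payload
```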

By making the Kinect portable, [Mike] et al. have made Microsoft’s 3D imaging device much more capable than its present task of computing the volumetric space of the inside of a cabinet would suggest. The ReconstructMe project allows the Kinect to be used as a handheld 3D scanner, and Kintinuous can create a 3D model of an entire house, building, or cave.

There’s a lot that can be done with a portable, WiFi-enabled Kinect, and hopefully a few builds replicating the team’s work (perhaps swapping the Gumstix board for a Raspberry Pi) will be showing up on HaD shortly.

Video after the break.

Continue reading “A Portable, WiFi-enabled Kinect”

Help Computer Vision Researchers, Get A 3D Model Of Your Living Room

Robots can easily make their way across a factory floor; with painted guide lines, a factory makes for an ideal environment for a robot to navigate. A much more difficult test of computer vision lies in your living room. Finding a way around a coffee table without knocking over a lamp presents a huge challenge for any autonomous robot. Researchers at the Royal Institute of Technology in Sweden are working on this problem, but they need your help.

[Alper Aydemir], [Rasmus Göransson] and Prof. [Patric Jensfelt] at the Centre for Autonomous Systems in Stockholm created Kinect@Home. The idea is simple: by modeling hundreds of living rooms in 3D, the computer vision and robotics researchers will have a fantastic library to train their algorithms.

To help out the Kinect@Home team, all that is needed is a Kinect, just like the one lying disused in your cupboard. After signing up on the Kinect@Home site, you’re able to create a 3D model of your living room, den, or office right in your browser. This 3D model is then added to the Kinect@Home library for CV researchers around the world.