3D Printering: Scanning 3D Models

The Makerbot Digitizer was announced this week, giving anyone with $1400 the ability to scan small objects and print out a copy on any 3D printer.

Given the vitriol spewed against Makerbot in the Hackaday comments and other forums on the Internet, it should be obvious that the set of Hackaday readers and the demographic Makerbot is developing and marketing for do not intersect. We’re thinking anyone reading this would rather roll up their sleeves and build a 3D scanner, but where to start? Below are a few options for those of you who want a 3D scanner but are none too keen on Makerbot’s offering.

Continue reading “3D Printering: Scanning 3D Models”

AquaTop: A Gaming Touch Display That Looks Like Demon Possessed Water

Are you ready to make a utility-sink-sized pool of water the location of your next living room game console? This demonstration is appealing, but maybe not ready for widespread adoption. AquaTop is an interactive display that combines water, a projector, and a depth camera.

The water has bath salts added to it which turn it a milky white. This does double duty, making it a reasonably reflective surface for the projector, and hiding your hands when below the surface. The video below shows several different games being played. But the most compelling demonstration involves individual finger tracking when your digits break the surface of the water (shown on the right above).
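The finger tracking is conceptually simple: anything the depth camera sees that sits above the known water plane must be a hand or a finger. Here is a minimal sketch of that idea using numpy on a single depth frame; the frame source, the water-plane distance, and the blob-size threshold are our assumptions, not details pulled from the AquaTop work.

```python
import numpy as np
from scipy import ndimage

def find_fingertips(depth_mm, water_plane_mm, min_height_mm=15, min_blob_px=20):
    """Find blobs that rise above the water surface in a depth frame.

    depth_mm        -- 2D array of distances from the camera (millimetres)
    water_plane_mm  -- calibrated distance from the camera to the still water
    min_height_mm   -- how far something must poke above the surface to count
    min_blob_px     -- ignore specks smaller than this many pixels
    """
    # Pixels closer to the camera than the water plane are above the surface.
    above = depth_mm < (water_plane_mm - min_height_mm)

    # Group connected pixels into candidate finger blobs.
    labels, n = ndimage.label(above)
    tips = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_blob_px:
            tips.append((int(xs.mean()), int(ys.mean())))  # blob centroid (x, y)
    return tips

# Toy frame: flat water at 1000 mm with one "finger" poking up to 950 mm.
frame = np.full((240, 320), 1000, dtype=np.uint16)
frame[100:110, 150:158] = 950
print(find_fingertips(frame, water_plane_mm=1000))
```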

There is also a novel feedback system. The researchers hacked some speakers so they could be submerged in the tank, adding a large speaker with LEDs on it in the same manner. When fed a 50 Hz signal they make the surface of the pool dance.
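Driving submerged speakers with a low-frequency tone is easy to try at home. A quick sketch, assuming the `sounddevice` package and whatever amplified output you have wired to the (waterproofed!) speakers, generates the 50 Hz sine wave mentioned above:

```python
import numpy as np
import sounddevice as sd   # assumes a working audio output feeding the speakers

RATE = 44100               # samples per second
FREQ = 50                  # the 50 Hz drive signal that makes the water dance
DURATION = 5               # seconds

t = np.arange(int(RATE * DURATION)) / RATE
tone = 0.8 * np.sin(2 * np.pi * FREQ * t)   # 80% amplitude to avoid clipping

sd.play(tone, RATE)        # feed the submerged speakers
sd.wait()                  # block until the burst finishes
```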

Continue reading “AquaTop: A Gaming Touch Display That Looks Like Demon Possessed Water”

Here Be Dragons, And VR…and Sheep.


This may qualify less as a hack and more as a clever combination of video game input devices, but we thought it was well worth showing off. [Jack] and his team built Dragon Eyes from scratch at the 2013 Dundee Dare Jam. If you’re unfamiliar with “Game Jams” and have any aspirations of working in the video game industry, we highly recommend that you find one and participate. With only 48 hours to design, code, build assets and test, many teams struggle to finish their entry. Dragon Eyes, however, uses the indie-favorite game engine Unity3D to smoothly coordinate its input devices, allowing players to experience dragon flight. The Kinect reads the player’s arm positions (including flapping) to direct the wings for travel, while the Oculus Rift performs its usual job as immersive VR headgear.
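The Kinect side of this boils down to reading skeleton joints and deciding when the player’s arms are “flapping”. The game itself is built in Unity, but the gist fits in a few lines of anything; below is a hedged Python sketch that watches hand height relative to the shoulders and counts a flap on each full downstroke. The joint names and thresholds are illustrative guesses, not values from [Jack]’s code.

```python
class FlapDetector:
    """Count arm flaps from skeleton joint heights (e.g. Kinect skeleton data).

    A flap is registered when both hands move from above shoulder height
    to below it. Thresholds are guesses, not [Jack]'s actual values.
    """

    def __init__(self, hysteresis_m=0.05):
        self.hysteresis = hysteresis_m   # metres above/below the shoulder line
        self.arms_up = False
        self.flaps = 0

    def update(self, left_hand_y, right_hand_y, shoulder_y):
        hands_up = (left_hand_y > shoulder_y + self.hysteresis and
                    right_hand_y > shoulder_y + self.hysteresis)
        hands_down = (left_hand_y < shoulder_y - self.hysteresis and
                      right_hand_y < shoulder_y - self.hysteresis)
        if hands_up:
            self.arms_up = True
        elif hands_down and self.arms_up:
            self.arms_up = False
            self.flaps += 1              # one complete downstroke = one flap
        return self.flaps

# Fake joint heights (metres) simulating two downstrokes.
detector = FlapDetector()
for lh, rh in [(1.6, 1.6), (1.0, 1.0), (1.6, 1.6), (1.0, 1.0)]:
    detector.update(lh, rh, shoulder_y=1.4)
print(detector.flaps)   # -> 2
```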

Combining a Kinect and a Rift isn’t particularly uncommon, but the function of the microphone is. By blowing into a headset microphone, players activate the dragon’s fire-breathing. How’s that for interactivity? You can see [Jack] roasting some sheep in a demonstration video below. If you have a Kinect and Rift lying around and want some first-person dragon action, [Jack] has kindly provided a download of the build in the project link above.
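Using breath as a fire button comes down to watching the microphone’s signal energy: blowing into a headset mic produces a loud, broadband rumble that a simple RMS threshold separates from ordinary speech. A minimal sketch, assuming the `sounddevice` package and a threshold you would tune by ear:

```python
import numpy as np
import sounddevice as sd   # assumes a working microphone input

RATE = 16000
BLOCK = 1600               # 100 ms of audio per decision
THRESHOLD = 0.1            # RMS level that counts as "blowing" -- tune this

def breathing_fire(indata, frames, time, status):
    rms = np.sqrt(np.mean(indata[:, 0] ** 2))
    if rms > THRESHOLD:
        print("FIRE!")     # in the game this would trigger the flame breath

with sd.InputStream(channels=1, samplerate=RATE, blocksize=BLOCK,
                    callback=breathing_fire):
    sd.sleep(10000)        # listen for ten seconds
```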

We’re looking forward to more implementations of the Rift; we haven’t seen many just yet. You can, however, check out a Rift used as an aerial camera on a drone.

Continue reading “Here Be Dragons, And VR…and Sheep.”

Kinect Full Body Scanner


Why let the TSA have all the fun when it comes to full body scanning? Not only can you get a digital model of yourself, but you can print it out to scale.

[Moheeb Zara] is still developing his Kinect-based full body scanner, but he took a bit of time to show off the first working prototype. The parts that went into the build were either cut on a bandsaw, laser cut, or 3D printed. The scanning part of the rig uses a free-standing vertical rail which allows the Kinect to move along the Z axis. The sled is held in place by gravity and moved up the rail by a winch, with steel cable looped over a pulley at the top.

The subject stands on a rotating platform which [Moheeb] designed and assembled. Beneath the platform you’ll find a laser-cut hoop with teeth on the inside. A motor mounted in a 3D printed bracket uses these teeth to rotate the platform. He’s still got some work to do in order to automate the platform; for this demo he moved through each step in the scanning process using manual switches. Captured data is assembled into a virtual model using ReconstructMe.
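Automating the platform is mostly a sequencing problem: rotate a fixed number of degrees, let the rig settle, grab a frame, repeat until you are back where you started, then hand everything to ReconstructMe. The sketch below shows that loop with placeholder `rotate_platform()` and `capture_depth_frame()` functions standing in for the motor driver and Kinect capture, neither of which is detailed in the post.

```python
import time

STEPS_PER_REV = 36                  # 10 degrees per stop -- an arbitrary choice
SETTLE_SECONDS = 1.0                # let the subject and platform stop swaying

def rotate_platform(degrees):
    """Placeholder: drive the toothed ring under the platform by `degrees`."""
    pass

def capture_depth_frame(index):
    """Placeholder: grab one Kinect depth frame for ReconstructMe."""
    print(f"captured frame {index}")

def run_scan():
    for step in range(STEPS_PER_REV):
        rotate_platform(360 / STEPS_PER_REV)
        time.sleep(SETTLE_SECONDS)  # wait for vibration to die down
        capture_depth_frame(step)

if __name__ == "__main__":
    run_scan()
```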

The Kinect has been used as a 3D scanner like this before. But that time it was scanning salable goods rather than people.

Continue reading “Kinect Full Body Scanner”

3D Mapping Of Rooms, Again

Last year we saw what may be the coolest application of a Kinect ever. It was called Kintinuous, and it’s back again, this time as Kintinuous 2.0, with new and improved features.

When we first learned of Kintinuous, we were blown away. The ability for a computer with a Kinect to map large-scale areas has applications as diverse as Google Street View, custom Counter-Strike maps, and archaeological excavations. There was one problem with Kintinuous 1.0, though: scanning a loop would create a disjointed map, where the beginning and end of a loop would end up in different places.

In the video for Kintinuous 2.0, you can see a huge scan over 300 meters in length with two loops automatically stitched back into a continuous scan. An amazing feat, especially considering the computer is processing seven million vertices in just a few seconds.
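Loop closure is the trick that makes this possible: once the system recognizes it has returned to a previously seen spot, the accumulated drift is known and can be smeared back along the trajectory. Kintinuous does this with a far more sophisticated deformation of the whole map, but the basic idea can be shown with a toy sketch that distributes the end-of-loop position error linearly over the estimated camera poses:

```python
import numpy as np

def distribute_loop_error(positions, loop_error):
    """Toy loop closure: spread the closing error linearly along the path.

    positions  -- (N, 3) estimated camera positions around one loop
    loop_error -- 3-vector: where the loop ended minus where it should have
    """
    n = len(positions)
    weights = np.linspace(0.0, 1.0, n)[:, None]   # 0 at the start, 1 at the end
    return positions - weights * loop_error

# A drifting "square" loop that ends 0.5 m away from its start point.
path = np.array([[0, 0, 0], [10, 0, 0], [10, 10, 0],
                 [0, 10, 0], [0.5, 0.0, 0.0]], dtype=float)
closed = distribute_loop_error(path, path[-1] - path[0])
print(closed[-1])   # -> [0. 0. 0.], the loop now closes exactly
```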

Unfortunately, it doesn’t look like there will be an official distribution of Kintinuous 2.0 anytime soon. The paper for this version of Kintinuous is still under review, and there are ‘issues’ surrounding the software that prevent any answer to the if-and-when question of a release. Once the paper is out, though, anyone is free to reimplement it, and we’ll gladly leave that as an open challenge to our readers.

Continue reading “3D Mapping Of Rooms, Again”

Human Asteroids Makes You A Vector Triangle Ship


In 1979, [Nolan Bushnell] released Asteroids to the world. Now, he’s playing the game again, only this time with the help of a laser projector and a Kinect that turns anyone sitting on a stool – in this case [Nolan] himself – into everyone’s favorite vector spaceship. It’s part of Steam Carnival, a project by [Brent Bushnell] and [Eric Gradman] that hopes to bring a modern electronic carnival to your town.

The reimagined Asteroids game uses a laser projector to display the asteroids and ship on the floor. A Kinect tracks the player sitting and rolling on a stool, while a smartphone serves as the triangular spaceship’s ‘fire’ button. The game is played in a 150 square foot arena, and is able to put anyone in the cockpit of an asteroid-mining triangle.
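Tying the pieces together means mapping the player’s position, as seen by the Kinect, onto the floor coordinates the laser projector draws in. The post doesn’t say how [Brent] and [Eric] do it, but a common approach is to fit a 2D affine transform from a few calibration points; a numpy sketch with made-up calibration numbers:

```python
import numpy as np

def fit_affine(kinect_xy, floor_xy):
    """Least-squares fit of a 2D affine map: floor = kinect @ A + b."""
    src = np.hstack([kinect_xy, np.ones((len(kinect_xy), 1))])  # add bias column
    params, *_ = np.linalg.lstsq(src, floor_xy, rcond=None)
    return params                       # 3x2 matrix holding A and b stacked

def apply_affine(params, point_xy):
    return np.append(point_xy, 1.0) @ params

# Hypothetical calibration: stand on four marked spots, record both coordinate pairs.
kinect_pts = np.array([[-0.8, 1.5], [0.8, 1.5], [0.8, 3.0], [-0.8, 3.0]])
floor_pts  = np.array([[0.0, 0.0], [3.5, 0.0], [3.5, 3.5], [0.0, 3.5]])

T = fit_affine(kinect_pts, floor_pts)
print(apply_affine(T, [0.0, 2.25]))     # a player near the middle of the arena
```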

[Brent] and [Eric] hope to bring their steam carnival to LA and San Francisco next spring, but if they exceed their funding goals, they might be convinced to bring their show east of the Mississippi. We’d love to try it out by hiding behind the score like the original Asteroids and wasting several hours.

Continue reading “Human Asteroids Makes You A Vector Triangle Ship”

Charlotte, The Hexapod With 3D Vision


Charlotte’s chassis comes as a kit, but the stock electronics are based on an Arduino – not something for a robot that needs to run computer vision apps. [Kevin] gave Charlotte eyes with an Asus XTION (Edit: or a PrimeSense sensor). These sensors are structured light depth cameras just like the Kinect, only smaller, lighter, and with better color output.

Hardware is only one half of the equation, so [Kevin] tossed the Arduino-based stock electronics and replaced them with a Raspberry Pi. This allowed him to hone his C++ skills and add one very cool peripheral – the XTION depth camera.

To the surprise of many, we’re sure, [Kevin] is running OpenNI on his Raspberry Pi, allowing Charlotte to take readings from her depth camera and keep from colliding with any objects. The Raspberry Pi is overclocked, of course, and the CPU usage hovers around 90%, but if you’re looking for a project that uses a depth sensor with a Pi, there you go.
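For the curious, reading the depth stream on the Pi looks roughly like the following with the OpenNI2 Python bindings (the `primesense` package). [Kevin]’s actual code is C++, and the 600 mm stop distance below is our assumption rather than Charlotte’s real setting.

```python
import numpy as np
from primesense import openni2   # OpenNI2 bindings; [Kevin]'s real code is C++

STOP_MM = 600   # halt if anything in the centre of view is closer than this

openni2.initialize()                      # loads libOpenNI2
dev = openni2.Device.open_any()           # the XTION / PrimeSense sensor
depth = dev.create_depth_stream()
depth.start()

try:
    while True:
        frame = depth.read_frame()
        buf = np.frombuffer(frame.get_buffer_as_uint16(), dtype=np.uint16)
        img = buf.reshape(frame.height, frame.width)

        # Look at a window straight ahead; zeros mean "no reading", ignore them.
        centre = img[frame.height // 3: 2 * frame.height // 3,
                     frame.width // 3: 2 * frame.width // 3]
        valid = centre[centre > 0]
        if valid.size and valid.min() < STOP_MM:
            print("obstacle ahead -- stop the legs")   # hook into gait control here
finally:
    depth.stop()
    openni2.unload()
```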

Continue reading “Charlotte, The Hexapod With 3D Vision”