CES: Building Booths and Simulating Reality

My first day on the ground at CES started with a somewhat amusing wait at the taxi stand at McCarran International Airport. Actually, I’m getting ahead of myself… it started with a surprisingly efficient badge-pickup booth in the baggage claim of the airport. Wait in line for about three minutes, show them the QR code emailed to you from online registration, and you’re ready to move to the 1/4-mile-long, six-switchback-deep line for cabs. Yeah, there are a lot of people here for this conference.

It’s striking just how huge this thing is. Every hotel on the strip is crawling with badge-wearing CES attendees. Many of the conference halls in the hotels are filled with booths, meaning the thing is spread out over a huge geographic area. We bought three-day monorail passes and headed to the convention center to get started.

Building the Booths

[Sophi] knows [Ben Unsworth], who put his heart and soul into this year’s IEEE booth. His company, Globacore, builds booths for conferences and this one sounds like it was an exceptional amount of fun to work on. He was part of a tiny team that built a mind-controlled drag strip based on Emotiv Insight brainwave-measuring hardware shipped directly from the first factory production run. This ties in with the display screens above the track to form a leaderboard. We’ll have a keen eye out for hacks this week, but the story behind building these booths may be the best hack to be found.


[Ben] told us that hands-down the thing to see is the new Oculus hardware called Crescent Bay. He emphatically mentioned the Holodeck, which is a comparison we don’t throw around lightly. It seems like a lot of people feel that way because the line to try it out is wicked long. We downloaded their app, which allows you to schedule a demo, but all appointments are already taken. Hopefully our Twitter plea will be seen by their crew.

In the meantime we tried out the Samsung Gear VR. It uses a Galaxy Note 4 as the screen along with lenses and a variety of motion tracking and user controls. The demo was a Zelda-like game where you view the scene from overhead. It used a handheld controller to command the in-game character, with the headset’s motion tracking used to look around the playing area. It was a neat demo. I’m not quite sold on long gaming sessions with the hardware, but maybe I just need to get used to full immersion first.

Window to another Dimension


The midways close at six o’clock and we made our way to the Occipital booth just as they were winding down. I’ve been 3D scanned a few times before, but those systems used turntables and depth cameras on motorized tracks to do the work. This one uses a depth-camera add-on for an iPad which they call the Structure Sensor.

It is striking how quickly the rig can capture a model. This high-speed performance is parlayed into other uses, like creating a virtual world inside the iPad which the user navigates by using the screen as if it were a magic window into another dimension. Their demo was something along the lines of the game Portal and has us thinking that the Wii U controller has the right idea for entertainment, but it needs the performance that Occipital offers. I liked this experience more than the Oculus demo because you are not shut off from the real world as you make your way through the virtual one.

We shot some video of the hardware and plan to post more about it as soon as we get the time to edit the footage.

Find Us or Follow Us

We’re wearing our Hackaday shirts and that stopped [Josh] in his tracks. He’s here on business with his company Evermind, but like any good hacker he is carrying around one of his passion projects in his pocket. What he’s showing off are a couple of prototypes for a CANbus sniffer and interface device that he’s built.

We’ll be at CES all week. You can follow our progress through the following Twitter accounts: @Hackaday, @HackadayPrize, @Szczys, and @SophiKravitz. If you’re here in person you can Tweet us to find where we are. We’re also planning a 9am Thursday Breakfast meetup at SambaLatte in the Monte Carlo. We hope you’ll stop by and say hi. Don’t forget to bring your own hardware!


Touching Light with Haptic Feedback

Many of us have gone on a stationary romp through some virtual or augmented scape with one of the few headsets out in the wild today. While the experience of viewing a convincing figment of reality is an exciting sensation in itself, [Mark Lee] and [Kevin Wang] are figuring out how to tie other senses into the mix.

The duo from Cornell University have built a mechanical exoskeleton that responds to light with haptic feedback. This means the wearer can touch the sphere of light around a source as if it were a solid object. Photoresistors are mounted like antennae on the tip of each finger, filed down around the edges to receive more diffuse light. When the wearer of the apparatus moves their hand towards a light source, the sensors trigger servo motors mounted on the back of the hand to actuate and retract a series of 3D-printed tendons which arch upward and connect to the individual fingers of the wearer. This way, as the resistors receive varying amounts of light, they can react independently to simulate physical contours.
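The per-finger control loop is simple enough to sketch. Here’s a minimal Python model of the sensor-to-servo mapping (the calibration constants are our own guesses for illustration, not values from [Mark] and [Kevin]’s build):

```python
def light_to_servo_angle(reading, ambient=100, saturation=900,
                         retracted=0, extended=160):
    """Map a photoresistor ADC reading to a tendon servo angle.

    Brighter light (a higher reading) means the finger is closer to the
    source, so the servo pulls the tendon further to resist the motion.
    All thresholds here are hypothetical calibration values.
    """
    # Clamp the reading into the calibrated range.
    reading = max(ambient, min(saturation, reading))
    # Linear interpolation: ambient light -> retracted, full light -> extended.
    fraction = (reading - ambient) / (saturation - ambient)
    return retracted + fraction * (extended - retracted)

# Each finger's sensor drives its own servo independently, which is
# how the glove can simulate contours across the hand.
angles = [light_to_servo_angle(r) for r in (100, 500, 900)]
```

Running each finger through this mapping independently is what lets one bright edge of the “object” push back on only some of the fingers.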

One of the goals of the project was to produce a working proof of concept with no more than 100 dollars worth of materials, which [Mark] and [Kevin] achieved with some cash to spare. Their list of parts can be found on their blog along with some more details on the project.

Continue reading “Touching Light with Haptic Feedback”

Ask Hackaday: What is The Future of Virtual Reality?

Most of us have heard of Second Life – that antiquated online virtual reality platform of yesteryear where users could explore, create, and even sell content. You might be surprised to learn that not only are they still around, but they’re also employing the Oculus Rift and completely redesigning their virtual world. With support for the DK2 Rift, the possibilities for a Second Life platform where users can share and explore each other’s creations open up some interesting doors.

Envision a world where you could log on to a “virtual net”, put on your favorite VR headset, and let your imagination run wild. You and some friends could make a city, a planet… an entire universe that you and thousands of others could explore. With a little bit of dreaming and an Arduino, VR can bring dreams to life.

Continue reading “Ask Hackaday: What is The Future of Virtual Reality?”

Seeing The World Through Depth Sensing Cameras

The Oculus Rift and all the other 3D video goggle solutions out there are great if you want to explore virtual worlds with stereoscopic vision, but until now we haven’t seen anyone exploring real life with digital stereoscopic viewers. [pabr] combined the Kinect-like sensor in an ASUS Xtion with a smartphone in a Google Cardboard-like setup for 3D views the human eye can’t naturally experience, like a third-person view, a radar-like display, and seeing what the world would look like with your eyes 20 inches apart.

[pabr] is using an ASUS Xtion depth sensor connected to a Galaxy SIII via the USB OTG port. With a little bit of code, the output from the depth sensor can be pushed to the phone’s display. The hardware setup consists of a VR-Spective, a rather expensive bit of plastic, but with the right mechanical considerations, a piece of cardboard or some foam board and hot glue would do quite nicely.
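The trick that makes views like these possible is that a color image plus a per-pixel depth map is enough to synthesize a second viewpoint: shift each pixel horizontally by a disparity inversely proportional to its depth. Here’s an illustrative Python sketch of that idea for a single image row (this is our own simplification, not [pabr]’s code; the baseline and focal length are made-up numbers, and real code would also fill the occlusion holes this leaves behind):

```python
def synthesize_right_view(row, depths, baseline=60.0, focal=500.0):
    """Shift one image row by depth-derived disparity to fake a second eye.

    `row` is a list of pixel values, `depths` the matching depths in mm.
    disparity (px) = focal * baseline / depth -- nearer pixels shift more,
    which is exactly the cue stereoscopic vision relies on.
    """
    width = len(row)
    out = [None] * width  # None marks occluded pixels with no source
    for x in range(width):
        disparity = int(round(focal * baseline / depths[x]))
        nx = x - disparity
        if 0 <= nx < width:
            out[nx] = row[x]
    return out
```

Making the virtual baseline huge (the “eyes 20 inches apart” trick) just means feeding in a larger `baseline` than human eyes have.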

[pabr] put together a video demo of his build, along with a few examples of what this project can do. It’s rather odd, and surprisingly not a superfluous way to see in 3D. You can check out that video below.

Continue reading “Seeing The World Through Depth Sensing Cameras”

Interacting with Virtual Reality Brings us Even Closer to a Real Holodeck

One of our readers has been playing around with virtual reality lately, and has come up with a pretty cool beta run of his research — virtual interaction using your hands.

Using an Oculus Rift, the Leap Motion controller and a beta run of Unity 4.6, [Tomáš Mariančík] put together a test environment for physical interaction. The Leap Motion controller is capable of tracking your fingers with extremely high detail, which allows him to create a pair of virtual hands inside the test environment that almost perfectly mimic his movements. The hack here is making it all work together.

In the following demo he shows off by interacting with holographic menus, grabbing body parts off of an anatomically correct human being (thanks to Unity3D), and manipulating his environment.

Continue reading “Interacting with Virtual Reality Brings us Even Closer to a Real Holodeck”

‘Duinos and VR Environments

At the Atmel booth at Maker Faire, they were showing off a few very cool bits and baubles. We’ve got a post on the WiFi shield in the works, but the most impressive person at the booth was [Quin]. He has a company, he’s already shipping products, and he has a few projects already in the works. What were you doing at 13?

[Quin]’s Qduino Mini is your basic Arduino compatible board with a LiPo charging circuit. There’s also a ‘fuel gauge’ of sorts for the battery. The project will be hitting Kickstarter sometime next month, and we’ll probably put that up in a links post.

Oh, [Quin] was also rocking some awesome kicks at the Faire. Atmel, I’m trying to give you money for these shoes, but you’re not taking it.

[Sophie] had a really cool installation at the Faire, and notably something that was first featured on hackaday.io. Basically, it’s a virtual reality Segway, built with an Oculus, a Leap Motion, a Wobbleboard, and an Android phone, that allows you to cruise on everyone’s favorite barely-cool balancing scooter through a virtual landscape.

This project was a collaboration between [Sophie], [Takafumi Ide], [Adelle Lin], and [Martha Hipley]. The virtual landscape was built in Unity, displayed on the Oculus, controlled with an accelerometer on a phone, and has input with a Leap Motion. There are destructible and interactable things in the environment that can be pushed around with the Leap Motion, and with the helmet-mounted fans, you can feel the wind in your hair as you cruise over the landscape on your hovering Segway-like vehicle. This is really one of the best VR projects we’ve ever seen.
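The tilt-to-motion part of a rig like this is easy to picture in code. Here’s a hedged Python sketch of turning a phone’s accelerometer reading into Segway-style speed (all constants and the dead-zone tuning are hypothetical; the team’s actual implementation isn’t published):

```python
import math

def tilt_to_velocity(ax, ay, az, max_speed=5.0, dead_zone=0.05):
    """Turn phone accelerometer readings (in g) into Segway-style motion.

    Leaning the board pitches the phone, and gravity's projection onto
    the phone's axes gives the lean angle. A small dead zone keeps the
    rider stationary when standing roughly upright.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # radians
    lean = pitch / (math.pi / 4)   # normalize: 45 degrees = full lean
    lean = max(-1.0, min(1.0, lean))
    if abs(lean) < dead_zone:
        return 0.0                 # standing upright: stay put
    return lean * max_speed        # forward/backward speed in m/s
```

In the real installation this velocity would drive the Unity camera each frame, while the Leap Motion handles pushing the interactive objects around.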

Flying a Drone with an Oculus Rift


Controlling autonomous vehicles remotely with the use of virtual reality headsets seems like an obvious next step. Already, a few UAV companies have begun experimenting with these types of ideas by integrating Oculus Rift developer kits into their hovering quadcopters and drones. Parrot released a video on their blog showing that they developed a head-tracking system for their Bebop Drone in an effort to bring FPV flights to fruition. It looks like a lot of fun and we want to try one of these out asap!

As for technical specifications, they can be found in the YouTube description of the video embedded below. A quick glance showed that the operating system is based on Android and uses WiFi to connect the handheld tablet to the autonomous vehicle floating above. The range is a whopping 2km, giving plenty of freedom to explore. Moving one’s head swivels the attached camera giving a more immersive flying experience.
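Mapping head motion to the camera is conceptually simple: clamp the headset’s yaw and pitch to whatever pan/tilt range the camera can cover, and send that as the camera command. A rough Python sketch of the idea (the limits here are our assumptions, not Parrot’s published values):

```python
def head_to_camera(yaw_deg, pitch_deg, pan_limit=90.0, tilt_limit=45.0):
    """Clamp headset orientation to the drone camera's pan/tilt range.

    The Bebop re-frames a wide-angle view rather than moving a gimbal,
    so "moving" the camera is just re-centering the crop; the limits
    here are hypothetical. Returns the (pan, tilt) command to send.
    """
    pan = max(-pan_limit, min(pan_limit, yaw_deg))
    tilt = max(-tilt_limit, min(tilt_limit, pitch_deg))
    return pan, tilt
```

Because only two angles need to cross the WiFi link each frame, the head-tracking loop adds almost nothing to the video stream’s bandwidth.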

This isn’t the first example of FPV drones that we have seen. Previously, we covered an Oculus Rift + Head Tracking setup and another similar integration with a Black Armor Drone. We are bound to see virtual reality equipment used to control drones more and more as developers get their hands on cutting-edge hardware like the Oculus Development Kit 2, which is currently shipping.

Continue reading “Flying a Drone with an Oculus Rift”