Castrol Virtual Drift: Hacking Code at 80MPH with a Driver in a VR Helmet

Driving a brand new 670 horsepower Roush Stage 3 Mustang while wearing virtual reality goggles. Sounds nuts, right? That’s exactly what Castrol Oil’s advertising agency came up with. They didn’t want to just make a commercial though – they wanted to do the real thing. Enter [Adam and Glenn], the engineers who were tasked with getting data from the car into a high-end gaming PC. The computer was running a custom simulation under the Unreal Engine. El Toro field provided a vast expanse of empty tarmac to drive the car without worry of hitting any real-world obstacles.

The Oculus Rift was never designed to be operated inside a moving vehicle, so it presented a unique challenge for [Adam and Glenn]. Every time the car turned or spun, the Oculus’ on-board Inertial Measurement Unit (IMU) would think driver [Matt Powers] was turning his head. At one point [Matt] was trying to drive while the game engine had him sitting in the passenger seat turned sideways. The solution was to install a 9-degree-of-freedom (9-DOF) IMU in the car, then subtract the movements of that IMU from the one in the Rift.
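Subtracting one IMU from another boils down to composing the headset orientation with the inverse of the vehicle orientation, leaving only the driver’s head motion relative to the cabin. A minimal sketch of that idea (quaternions as `(w, x, y, z)` tuples; the function names are ours, not from [Adam and Glenn]’s code):

```python
import math

def q_mul(a, b):
    # Hamilton product of two quaternions (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    # Inverse of a unit quaternion is its conjugate
    w, x, y, z = q
    return (w, -x, -y, -z)

def head_relative_to_car(q_headset, q_car):
    # Remove the car's rotation from the headset reading so only
    # the driver's actual head motion drives the game camera.
    return q_mul(q_conj(q_car), q_headset)

# If the car and the headset both yaw 90 degrees (driver holding his
# head still relative to the cabin), the relative rotation is identity.
yaw90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
rel = head_relative_to_car(yaw90, yaw90)
```

In practice the two sensors also have to be time-synchronized and axis-aligned before the subtraction means anything, which is likely where most of the real engineering effort went.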

GPS data came from a Real Time Kinematic (RTK) GPS unit. Unfortunately, the GPS had a 5 Hz update rate – not nearly fast enough for a car moving close to 100 MPH. The GPS was relegated to aligning the virtual and real worlds at the start of the simulation. The rest of the data came from the IMUs and the car’s own CAN bus. [Adam and Glenn] used an Arduino with a Microchip MCP2515 CAN bus interface to read values such as steering angle, throttle position, brake pressure, and wheel spin. The data was then passed on to the Unreal engine. The Arduino code is up on GitHub, though the team had to sanitize some of Ford’s proprietary CAN message data to avoid a lawsuit. It’s worth noting that [Adam and Glenn] didn’t have any support from Ford on this, they just sniffed the CAN network to determine each message ID.
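Decoding a sniffed CAN frame is mostly a matter of matching the message ID and scaling the raw bytes. The sketch below shows the general shape of that; the IDs, byte layouts, and scale factors are entirely made up for illustration, since the real Ford layouts are exactly what got sanitized out of the published code:

```python
def decode_frame(can_id, data):
    """Decode a few vehicle signals from a raw CAN frame.

    The message IDs and scale factors here are hypothetical --
    the real ones were reverse-engineered by sniffing the bus
    and were stripped from the team's GitHub release.
    """
    if can_id == 0x010:  # hypothetical steering angle message
        # 16-bit big-endian raw value, 0.1 degree per bit, centered at 32768
        raw = (data[0] << 8) | data[1]
        return {"steering_deg": (raw - 32768) * 0.1}
    if can_id == 0x020:  # hypothetical throttle position message
        return {"throttle_pct": data[0] * 100 / 255}
    return {}  # unknown ID: ignore

# A sniffed frame: ID 0x010 carrying a raw steering value of 0x81F4 (33268)
signals = decode_frame(0x010, bytes([0x81, 0xF4]))
```

On the Arduino side the same logic runs in C against whatever frames the MCP2515 hands over its SPI interface; the decoded values are then streamed to the PC for Unreal to consume.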

The final video has the Hollywood treatment. “In game” footage has been replaced with pre-rendered sequences, which look so good we’d think the whole thing was fake – that is, if we didn’t know better.

Click past the break for the final commercial and some behind the scenes footage.

Continue reading “Castrol Virtual Drift: Hacking Code at 80MPH with a Driver in a VR Helmet”

ANUBIS, A Natural User Bot Interface System

[Matt], [Andrew], [Noah], and [Tim] have a pretty interesting build for their capstone project at Ohio Northern University. They’re using a Microsoft Kinect and a Leap Motion to create a natural user interface for controlling humanoid robots.

The robot the team is using for this project is a tracked humanoid robot they’ve affectionately come to call Johnny Five. Johnny takes commands from a computer, Kinect, and Leap Motion to move the chassis, arm, and gripper around in a way that’s somewhat natural, and surely a lot easier than controlling a humanoid robot with a keyboard.

The team has also released all their software onto GitHub under an open source license. You can grab that over on GitHub, or take a look at some of the pics and videos from the Columbus Mini Maker Faire.

CES: Building Booths and Simulating Reality

My first day on the ground at CES started with a somewhat amusing wait at the Taxi Stand of the McCarran International Airport. Actually I’m getting ahead of myself… it started with a surprisingly efficient badge-pickup booth in the baggage claim of the airport. Wait in line for about three minutes, show them the QR code emailed to you from online registration, and you’re ready to move to the 1/4 mile-long, six-switchback-deep line for cabs. Yeah, there’s a lot of people here for this conference.

It’s striking just how huge this thing is. Every hotel on the strip is crawling with badge-wearing CES attendees. Many of the conference halls in the hotels are filled with booths, meaning the thing is spread out over a huge geographic area. We bought three-day monorail passes and headed to the convention center to get started.

Building the Booths

[Sophi] knows [Ben Unsworth] who put his heart and soul into this year’s IEEE booth. His company, Globacore, builds booths for conferences and this one sounds like it was an exceptional amount of fun to work on. He was part of a tiny team that built a mind-controlled drag strip based on Emotiv Insight brainwave-measuring hardware shipped directly from the first factory production run. This ties in with the display screens above the track to form a leader board. We’ll keep a keen eye out for hacks this week, but the story behind building these booths may be the best hack to be found.


[Ben] told us hands-down the thing to see is the new Oculus hardware called Crescent Bay. He emphatically compared it to the Holodeck, which is a comparison we don’t throw around lightly. Seems like a lot of people feel that way because the line to try it out is wicked long. We downloaded their app which allows you to schedule a demo but all appointments are already taken. Hopefully our Twitter plea will be seen by their crew.

In the meantime we tried out the Oculus Gear VR. It uses a Galaxy Note 4 as the screen along with lenses and a variety of motion tracking and user controls. The demo was a Zelda-like game where you view the scene from overhead. This used a handheld controller to command the in-game character, with the headset’s motion tracking used to look around the playing area. It was a neat demo. I’m not quite sold on long gaming sessions with the hardware, but maybe I just need to get used to full immersion first.

Window to another Dimension


The midways close at six o’clock and we made our way to the Occipital booth just as they were winding down. I’ve been 3D scanned a few times before but those systems used turntables and depth cameras on motorized tracks to do the work. This uses a depth-camera add-on for an iPad which they call Structure Sensor.

It is striking how quickly the rig can capture a model. This high-speed performance is parlayed into other uses, like creating a virtual world inside the iPad which the user navigates by using the screen as if it were a magic window into another dimension. Their demo was something along the lines of the game Portal and has us thinking that the Wii U controller has the right idea for entertainment, but it needs the performance that Occipital offers. I liked this experience more than the Oculus demo because you are not shut off from the real world as you make your way through the virtual.

We shot some video of the hardware and plan to post more about it as soon as we get the time to edit the footage.

Find Us or Follow Us

We’re wearing our Hackaday shirts and that stopped [Josh] in his tracks. He’s here on business with his company Evermind, but like any good hacker he is carrying around one of his passion projects in his pocket. What he’s showing off are a couple of prototypes for a CAN bus sniffer and interface device that he’s built.

We’ll be at CES all week. You can follow our progress through the following Twitter accounts: @Hackaday, @HackadayPrize, @Szczys, and @SophiKravitz. If you’re here in person you can Tweet us to find where we are. We’re also planning a 9am Thursday Breakfast meetup at SambaLatte in the Monte Carlo. We hope you’ll stop by and say hi. Don’t forget to bring your own hardware!


Using the Oculus Rift as a Multi-Monitor Replacement

[Jason] has been playing around with the Oculus Rift lately and came up with a pretty cool software demonstration. It’s probably been done before in some way, shape, or form, but we love the idea anyway and he’s actually released the program so you can play with it too!

It’s basically a 3D Windows Manager, aptly called 3DWM — though he’s thinking of changing the name to something a bit cooler, maybe the WorkSphere or something.

As he shows in the following video demonstration, the software allows you to set up multiple desktops and windows on your virtual sphere created by the Oculus — essentially creating a virtual multi-monitor setup. There are a few obvious cons to this setup which make it a bit impractical at the moment: the inability to see your keyboard (though this shouldn’t really be a problem), the inability to see people around you… and of course the hardware and its lack of proper resolution. But besides that, it’s still pretty awesome!
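The core layout problem of a spherical window manager is mapping each window’s viewing angles onto a point on a sphere around the user. A rough sketch of that math, assuming a simple yaw/pitch placement scheme (this is our guess at the kind of layout a tool like 3DWM needs, not code from [Jason]’s release):

```python
import math

def window_position(yaw_deg, pitch_deg, radius=2.0):
    """Place a virtual window on a sphere around the viewer.

    Hypothetical layout math: yaw/pitch viewing angles mapped to an
    (x, y, z) point at a fixed radius, with -z as "straight ahead"
    (the usual right-handed graphics convention).
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = radius * math.cos(pitch) * math.sin(yaw)
    y = radius * math.sin(pitch)
    z = -radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# A three-monitor arrangement: straight ahead plus 40 degrees either side
positions = [window_position(a, 0.0) for a in (-40, 0, 40)]
```

Keeping every window at the same radius means they all stay in focus at the Rift’s fixed focal distance, which is one plausible reason to prefer a sphere over an arbitrary 3D scattering of windows.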

In future development he hopes to add Kinect 2 and Leap Motion controller integration to help make it even more intuitive — maybe more Minority Report style.

Continue reading “Using the Oculus Rift as a Multi-Monitor Replacement”

Seeing The World Through Depth Sensing Cameras

The Oculus Rift and all the other 3D video goggle solutions out there are great if you want to explore virtual worlds with stereoscopic vision, but until now we haven’t seen anyone exploring real life with digital stereoscopic viewers. [pabr] combined the Kinect-like sensor in an ASUS Xtion with a smartphone in a Google Cardboard-like setup for 3D views the human eye can’t naturally experience like a third-person view, a radar-like display, and seeing what the world would look like with your eyes 20 inches apart.

[pabr] is using an ASUS Xtion depth sensor connected to a Galaxy SIII via the USB OTG port. With a little bit of code, the output from the depth sensor can be pushed to the phone’s display. The hardware setup consists of a VR-Spective, a rather expensive bit of plastic, but with the right mechanical considerations, a piece of cardboard or some foam board and hot glue would do quite nicely.
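Views like the third-person camera or the 20-inch eye spacing all fall out of the same trick: back-project each depth pixel into a 3D point, then re-render the point cloud from any virtual eye position. A minimal sketch of the back-projection step, using rough placeholder intrinsics for a 640×480 structured-light sensor rather than calibrated values from [pabr]’s build:

```python
def depth_to_point(u, v, depth_mm, fx=570.0, fy=570.0, cx=320.0, cy=240.0):
    """Back-project one depth pixel into a 3D point (meters).

    Standard pinhole model: fx/fy are focal lengths in pixels and
    (cx, cy) is the principal point. The numbers here are typical
    ballpark values for a Kinect-class sensor, not calibration data.
    """
    z = depth_mm / 1000.0        # sensor reports depth in millimeters
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the image center, 1.5 m away, lands on the optical axis
p = depth_to_point(320, 240, 1500)
```

Once every pixel is a point in space, synthesizing a stereo pair with an exaggerated baseline is just rendering the same cloud from two cameras 20 inches apart.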

[pabr] put together a video demo of his build, along with a few examples of what this project can do. It’s rather odd, and surprisingly not a superfluous way to see in 3D. You can check out that video below.

Continue reading “Seeing The World Through Depth Sensing Cameras”

Bring A Hack at World Maker Faire 2014

After a hard Saturday at World Maker Faire, some of the best and brightest in the Hacker/Maker community descended on The Holiday Inn for “Bring A Hack”. Created by [Jeri Ellsworth] several years ago at the Bay Area Maker Faire, Bring A Hack (BAH) is an informal gathering. Sometimes a dinner, sometimes a group getting together at a local bar, BAH has just one rule: You have to bring a hack!

[Sophi Kravitz] has become the unofficial event organizer for BAH in New York. This year she did a bit of live hacking, as she converted her Wobble Wonder headgear from wired to wireless control.

[Chris Gammell] brought his original Bench BudEE from Contextual Electronics. He showed off a few of his board customizations, including making a TSSOP part fit on the wrong footprint.

[Windell and Lenore] from Evil Mad Scientist Laboratories brought a few hacks along. They picked up an old Radio Shack music player chip at the Electronics Flea Market and built it up on a breadboard. Also on display was their new EggBot Pro. The Pro is a beautifully machined version of the eggbot. Everything is built strong to withstand the sort of duty an EggBot would see at a hackerspace or public library. [Windell] was full of surprises, as he also gave everyone chunks of Sal Ammoniac, which is a great way to bring the tin back to a tired soldering iron tip. The hack was that he found his Sal Ammoniac at a local Indian grocery in the Bay Area. Check out [Windell’s] blog entry for more information.

[Cal Howard] brought his DIY VR goggles. [Cal] converted a Kindle Fire into an Oculus Rift-style head-mounted display by adding a couple of magnifying lenses and some bamboo kebab sticks to hold the lenses in place. Judicious use of cardboard and duct tape completed the project. His current hurdle is getting past the Fire’s lack of an accelerometer. [Cal] planned to spend Sunday at Maker Faire adding one of his own!

As the hour grew late, everyone started to trickle out. Tired but happy from a long day at Maker Faire, the Bring A Hacker partygoers headed back to their hotels to get some sleep before World Maker Faire’s final day.

3D Printed Virtual Reality Goggles


Oculus, as we know, was acquired by Facebook for $2 billion, and now the VR community has been buzzing about trying to figure out what to do with all this newly accessible technology. And adding to the interest, the 2nd iteration of the development kits was released, causing a resurgence in virtual reality development as computer-generated experiences started pouring out from every corner of the world. But not everyone can afford the $350 USD price tag to purchase one of these devices, bringing out the need for Do-It-Yourself projects like these 3D printed wearable video goggles via Adafruit.

The design of this project is reminiscent of the VR2GO mobile viewer that came out of the MxR Lab (aka the research environment that spun out Palmer Luckey before he created Oculus). However, the hardware here is more robust and utilizes a 5.6″ display and 50mm aspheric lenses instead of a regular smart phone. The HD monitor is held within a 3D printed enclosure along with an Arduino Micro and 9-DOF motion sensor. The outer hood of the case is composed of a combination of PLA and Ninjaflex printing filament, keeping the frame rigid while the area around the eyes remains flexible and comfortable. The faceplate is secured with a mounting bracket and a pair of aspheric lenses inside split the screen for stereoscopic video. Head straps were added allowing for the device to fit snugly on one’s face.
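The 9-DOF sensor is what gives the goggles head tracking. The simplest piece of that job is recovering pitch and roll from the accelerometer’s gravity vector; a bare-bones sketch of that step (a real build, like this one, would fuse in the gyro and magnetometer as well, and these function names are ours, not Adafruit’s):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate head pitch and roll (degrees) from the gravity vector.

    ax/ay/az are accelerometer readings in g. This only works while the
    head is not accelerating; gyro fusion handles the rest in practice.
    """
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Sensor flat and level: gravity lies entirely on the z axis
pitch, roll = tilt_from_accel(0.0, 0.0, 1.0)
```

Yaw cannot be recovered from gravity alone, which is exactly why a 9-DOF part (adding a magnetometer) is the usual choice over a bare accelerometer.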

At the end of the tutorial, the instructions state that once everything is assembled, all that is required afterwards is to plug in a 9V power adapter and an HDMI cable sourcing video from somewhere else. This should get the console up and running; but it would be interesting to see if a future revision of this design can eliminate the wires and make this into a portable unit. Regardless, this project does a fantastic job of showing what it takes to create a homemade virtual reality device. And as you can see from the product list after the break, the price of the project fits under the $350 DK2 amount, helping to save some money while still providing a fun and educational experience.

Continue reading “3D Printed Virtual Reality Goggles”