You’re the Only One not Playing with Unity

It wasn’t too long ago that one could conjecture that most hackers are not avid video game players. We spend most of our free time taking things apart, tinkering with microcontrollers, and reading the latest [Jenny List] article on Hackaday.com. When we do think of video games, our neurons generally fire in the direction of emulating a console on a single board computer, such as a Raspberry Pi or a BeagleBone, or even emulating the actual console processor on an FPGA. Rarely do we venture off into 3D programs meant to make modern video games. If we can’t export an .STL with it, we’re not interested. It’s just not our bag.

Oculus Rift changed this. The VR headset was originally designed for 3D video games, but it quickly became a darling of hackers the world over. Virtual reality technology is far bigger than just video games, and it brings opportunity to many fields such as real estate, construction, product visualization, education, social interaction… the list goes on and on.

The Oculus team got together with the folks over at Unity early on to make it easy for video game makers to create content for the Rift. Unity is a game engine designed to have a shallow learning curve, and it’s free for non-commercial use. The Oculus Rift can be integrated into a Unity environment by checking a setting and importing a small package available from the Oculus site. This makes it easy for anyone interested in VR technology to get a Rift and start pumping out content.

Hackers have taken things a step further and have written scripts that allow Unity to communicate with an Arduino. VR is fun, but VR plus physical reality is downright exciting! In this article, we’re going to walk you through setting up your Oculus Rift and the Unity game engine to communicate with the outside world via an Arduino.
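To give a feel for what the Unity side of that serial link can look like, here’s a minimal, hypothetical sketch (our illustration, not the code from the article): a MonoBehaviour that reads newline-terminated analog readings from an Arduino running something like Serial.println(analogRead(A0)); and maps them onto an object’s position. The port name, baud rate, and the mapping are assumptions to adapt to your own setup, and the project’s Api Compatibility Level must include System.IO.Ports.

```csharp
using System.IO.Ports;
using UnityEngine;

public class ArduinoBridge : MonoBehaviour
{
    SerialPort port;

    void Start()
    {
        // "COM3" and 9600 baud are placeholders -- match them to your board.
        port = new SerialPort("COM3", 9600);
        port.ReadTimeout = 50; // keep a quiet port from stalling the frame
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine(); // one sensor sample per line
            if (float.TryParse(line, out float value))
            {
                // Map a 0-1023 analog reading onto this object's height.
                transform.position = new Vector3(0f, value / 1023f, 0f);
            }
        }
        catch (System.TimeoutException)
        {
            // No new data arrived this frame; carry on.
        }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```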

Continue reading “You’re the Only One not Playing with Unity”

The Ninja Run: a VR Movement Experiment

VR is an area seeing plenty of DIY experimentation, and [FultonX] has come up with an interesting hack of sorts: he’s discovered something that meshes well with how we perceive motion and movement. It’s an experimental movement system for VR he calls the Ninja Run, and it somewhat resembles skiing.

Even room-scale VR suffers from the fact that the player is more or less stuck in one place. Moving the player from one spot to another isn’t currently a gracefully solved problem, and many existing methods are not immersive or have other drawbacks. One solution in use is a sort of teleportation; another “slides” the player to another area on command (like gliding across ice). [FultonX] found these existing solutions lacking and prototyped the Ninja Run concept, which he found surprisingly intuitive and effective. Video demo embedded below.

Continue reading “The Ninja Run: a VR Movement Experiment”

Hexapod Tank from Ghost in the Shell Brought to Life

Every now and then someone gets seriously inspired, and that urge just doesn’t go away until something gets created. For [Paulius Liekis], it led to creating a roughly 1:20 scale version of the T08A2 Hexapod “Spider” Tank from the movie Ghost in the Shell. As he puts it, “[T]his was something that I wanted to build for a long time and I just had to get it out of my system.” It uses two Raspberry Pi computers and 28 servo motors, and it required over 250 hours of 3D printing for all the meticulously modeled pieces, plus even more time for polishing, filing, painting, and other finishing work after the pieces came off the printer. The paint job is spectacular, with great-looking wear and tear. It’s even better in motion; see the video embedded below.

Continue reading “Hexapod Tank from Ghost in the Shell Brought to Life”

Real-time Driving of RGB LED Cube using Unity3D

RGB LED cubes are great, but building the cube is only half the battle – they also need to be driven. The larger the cube, the bigger the canvas you have to exercise your performance art, and the more intense the data visualization headache. This project solves the problem by using Unity to drive an RGB LED cube in real-time.

We’re not just talking about driving the LEDs themselves at a low level, but about how you get what you want to display into each of those 512 pixels.

In the video, you can see [TylerTimoJ]’s demo of an 8x8x8 cube being driven in real time using the Unity engine. A variety of methods are demonstrated, from turning individual LEDs on and off, to coloring swaths of the cube as though with a paintbrush, to having the cube display source image data in real time (as shown on the left).
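For a sense of how the Unity-to-cube plumbing might work, here is a hedged sketch (our illustration, not [TylerTimoJ]’s code): it samples the colors of 512 proxy voxel objects in the scene, packs them into a 1536-byte RGB frame, and streams that frame out over serial once per render. The wire format, the port settings, and the idea of using scene objects as proxies are all assumptions; a real cube controller defines its own protocol.

```csharp
using System.IO.Ports;
using UnityEngine;

public class CubeStreamer : MonoBehaviour
{
    const int Size = 8;

    // Scene objects standing in for the cube's voxels, ordered x, then y, then z.
    public Renderer[] voxels = new Renderer[Size * Size * Size];

    SerialPort port;
    readonly byte[] frame = new byte[Size * Size * Size * 3];

    void Start()
    {
        port = new SerialPort("COM4", 115200); // placeholder port settings
        port.Open();
    }

    void LateUpdate()
    {
        // Sample each proxy voxel's material color and quantize to 8-bit RGB.
        for (int i = 0; i < voxels.Length; i++)
        {
            if (voxels[i] == null) continue; // skip unassigned slots in this sketch
            Color c = voxels[i].material.color;
            frame[i * 3 + 0] = (byte)(c.r * 255f);
            frame[i * 3 + 1] = (byte)(c.g * 255f);
            frame[i * 3 + 2] = (byte)(c.b * 255f);
        }
        port.Write(frame, 0, frame.Length); // one complete cube refresh per render
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```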

Continue reading “Real-time Driving of RGB LED Cube using Unity3D”

Amazing Oscilloscope Graphics

From what we can understand, [ompuco] has built a 2D audio output on top of the Unity game engine, enabling him to send X and Y values from his stereo soundcard straight to an oscilloscope in XY mode. His code simply scans through all the vertices in the scene and outputs the corresponding voltages into the left and right audio streams. He’s using this to create some pretty incredible animations. Check out the video “additives” below for an example. (See if you can figure out what’s being “added”.)
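As a rough illustration of the technique (not [ompuco]’s actual code), the sketch below uses Unity’s OnAudioFilterRead callback to fill the stereo output buffer directly, putting X on the left channel and Y on the right so that a scope in XY mode traces whatever points are fed in. The placeholder point list and the normalization to the -1..1 range are our assumptions; a real implementation would feed in projected mesh vertices instead.

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))] // attach to a GameObject with an AudioSource
public class ScopeXYOutput : MonoBehaviour
{
    // Points to trace, in the -1..1 range. Filled elsewhere in a real setup
    // (e.g. from mesh vertices projected by the camera); here just a square.
    public Vector2[] points =
    {
        new Vector2(-0.5f, -0.5f), new Vector2( 0.5f, -0.5f),
        new Vector2( 0.5f,  0.5f), new Vector2(-0.5f,  0.5f),
    };

    int index;

    // Runs on the audio thread: each stereo sample pair becomes one (X, Y)
    // deflection of the oscilloscope beam.
    void OnAudioFilterRead(float[] data, int channels)
    {
        if (channels < 2 || points.Length == 0) return;
        for (int i = 0; i < data.Length; i += channels)
        {
            Vector2 p = points[index];
            data[i]     = p.x; // left channel  -> scope X input
            data[i + 1] = p.y; // right channel -> scope Y input
            index = (index + 1) % points.Length;
        }
    }
}
```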

Continue reading “Amazing Oscilloscope Graphics”

Here Be Dragons, and VR…and sheep.

This may qualify less as a hack and more as a clever combination of video game input devices, but we thought it was well worth showing off. [Jack] and his team built Dragon Eyes from scratch at the 2013 Dundee Dare Jam. If you’re unfamiliar with “game jams” and have any aspirations of working in the video game industry, we highly recommend that you find one and participate. With only 48 hours to design, code, build assets, and test, many teams struggle to finish their entry. Dragon Eyes, however, uses the indie-favorite game engine Unity3D to smoothly coordinate its input devices, allowing players to experience dragon flight. The Kinect reads the player’s arm positions (including flapping) to direct the wings for travel, while the Oculus Rift performs its usual job as immersive VR headgear.

Combining a Kinect and a Rift isn’t particularly uncommon, but the function of the microphone is. By blowing into a headset microphone, players activate the dragon’s fire-breathing. How’s that for interactivity? You can see [Jack] roasting some sheep in a demonstration video below. If you have a Kinect and Rift lying around and want some first-person dragon action, [Jack] has kindly provided a download of the build in the project link above.
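The summary doesn’t include the jam team’s code, but a plausible Unity-side sketch of blow detection might look like the following: record from the default microphone into a short looping AudioClip, measure the RMS level of the most recent samples, and trigger the flame effect when it crosses a threshold. The threshold, sample rate, and window size here are guesses to be tuned per headset.

```csharp
using UnityEngine;

public class BlowDetector : MonoBehaviour
{
    public float threshold = 0.2f;   // assumed loudness threshold, tune per headset
    const int SampleRate = 16000;    // assumed capture rate
    readonly float[] window = new float[1024];
    AudioClip micClip;

    void Start()
    {
        // null = default microphone; record into a 1-second looping buffer.
        micClip = Microphone.Start(null, true, 1, SampleRate);
    }

    void Update()
    {
        int micPos = Microphone.GetPosition(null) - window.Length;
        if (micPos < 0) return; // not enough data captured yet

        micClip.GetData(window, micPos); // copy out the latest chunk of samples

        // Root-mean-square loudness of the window.
        float sum = 0f;
        foreach (float s in window) sum += s * s;
        float rms = Mathf.Sqrt(sum / window.Length);

        if (rms > threshold)
            Debug.Log("Blow detected -- breathe fire!"); // stand-in for the flame effect
    }
}
```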

We’re looking forward to more implementations of the Rift; we haven’t seen many just yet. You can, however, check out a Rift used as an aerial camera on a drone.

Continue reading “Here Be Dragons, and VR…and sheep.”