Here is a virtual spray painting project with a DIY twist. [Adam Amaral]’s project is an experiment with the Vive Tracker, which was released earlier this year. [Adam] demonstrates how to interface simple hardware and 3D printed parts to the Tracker’s GPIO pins, turning it into a custom peripheral that is fully tracked and interactive in the Vive’s VR environment. He details not only the custom spray can controller, but also how to handle the device on the software side in the Unreal engine. The 3D printed “spray can controller” even rattles when shaken!
There’s one more trick. Since the Vive Tracker is wireless and completely self-contained, the completed rattlecan operates independently from the VR headset. This means it’s possible to ditch the goggles and hook up a projector, then use the 3D printed spray can to paint a nearby wall with virtual paint; you can see that part in action in the video embedded below.
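[Adam]’s actual Unreal logic isn’t reproduced here, but the heart of painting a real wall with a tracked can is a ray-plane intersection: cast a ray from the Tracker’s pose along its forward vector and find where it meets the wall. A minimal Python sketch of that math (the vector helpers and the wall’s position are illustrative assumptions, not [Adam]’s code):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Where a ray from the tracked can meets the wall plane.
    Returns None if the ray is parallel to or pointing away from the wall."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # spraying parallel to the wall
    t = dot(tuple(p - o for p, o in zip(plane_point, origin)), plane_normal) / denom
    if t < 0:
        return None  # wall is behind the can
    return tuple(o + t * d for o, d in zip(origin, direction))

# A can half a metre from a wall at z = 2, aimed straight at it:
# ray_plane_hit((0, 0, 1.5), (0, 0, 1), (0, 0, 2), (0, 0, -1)) -> (0.0, 0.0, 2.0)
```

In a real implementation the spray cone would scatter many such rays around the forward vector, with the splat size scaling with distance to the hit point.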
Continue reading “Spray Paint Goes DIY Virtual with a Vive Tracker”
Taking a stroll through the woods in the midst of autumn is a stunning visual experience. It does, however, require one to live near a forest. If you are one of those who do not, [Koen Hufkens] has recently launched the Virtual Forest project — a VR experience that takes you through a day in a deciduous forest.
First off, you don’t need a VR apparatus to view the scenery. Web browsers and most smartphones are capable of displaying the 360-degree images. The Raspberry Pi 2-controlled Ricoh Theta S camera is enclosed in a glass lamp cover and — with the help of some PVC pipe — mounted on a standard fence post. Power is delivered ingeniously via a Cat5e cable, and a surge protector has been included in case of lightning strikes. Depending on when you view the website, you could be confronted with a black screen, or a kaleidoscope of color.
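The write-up doesn’t include [Koen]’s capture script, but the Theta S speaks the Open Spherical Camera (OSC) HTTP API, so a Pi can trigger a shot with a single POST. A hedged sketch of how that could look (the camera’s default access-point address is the documented one, but the surrounding setup and scheduling are assumptions):

```python
import json
import urllib.request

# The Theta S answers on its own Wi-Fi access point at this address by default
THETA_URL = "http://192.168.1.1/osc/commands/execute"

def capture_command():
    """JSON body for the OSC 'camera.takePicture' command."""
    return json.dumps({"name": "camera.takePicture"}).encode("utf-8")

def take_picture(url=THETA_URL):
    """Fire the shutter and return the camera's JSON response."""
    req = urllib.request.Request(
        url, data=capture_command(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Call take_picture() from cron or a sleep loop to build the day-long sequence.
```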
Continue reading “A Virtual Glimpse Into The Forest”
Inspired by playing The Legend of Zelda video game series, Cornell University students [Mohamed Abdellatif] and [Michael Ross] created a Virtual Archery game as their ECE 4760 Final Project. The game consists of a bow equipped with virtual arrows and a target placed about 20 ft away. The player has three rounds to score as high as possible. A small display monitor shows the instructions and an image of where the shot actually hit the target.
Pressing a button on the front of the bow readies a virtual arrow. A stretch sensor tells a microcontroller when the bow string has been drawn and released. While the bow is drawn, a line of LEDs lights up to simulate a nocked arrow. The player aims, factoring in gravity. An accelerometer measures the orientation of the bow when the shot is released. The calculated shot is then shown on the display monitor along with your score.
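The students’ firmware isn’t quoted here, but the gravity calculation boils down to textbook projectile motion: take the pitch angle from the accelerometer and a launch speed derived from the draw, then find where the arrow crosses the target plane 20 ft away. A Python sketch of that core (the launch speed and release height are illustrative assumptions, not values from the project):

```python
import math

G_FPS2 = 32.174  # gravity in ft/s^2, to match the 20 ft target distance

def impact_height(pitch_deg, speed_fps, distance_ft, release_height_ft=5.0):
    """Height at which the virtual arrow crosses the target plane."""
    pitch = math.radians(pitch_deg)
    vx = speed_fps * math.cos(pitch)   # horizontal speed toward the target
    vy = speed_fps * math.sin(pitch)   # initial vertical speed
    t = distance_ft / vx               # time of flight to the target plane
    return release_height_ft + vy * t - 0.5 * G_FPS2 * t * t

# A flat 100 ft/s shot drops about 0.64 ft over the 20 ft to the target,
# which is why the player has to aim slightly high.
```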
This immediately makes me think of laser tag, and feels like a product that could easily be mass-marketed. I’m surprised it hasn’t been already. Good work, guys.
Check out the video demonstration after the break:
Continue reading “Virtual Archery game makes practicing convenient, safe, and inexpensive”
[Diego] wrote in to let us know about the haptic feedback arm project he’s hard at work on. He calls it the Vimphin, a name built from the words Virtual Manipulator Physical Interface. Instead of a claw, the robot arm has a hand grip that lets you easily move it around. That is, unless the virtual model of the arm encounters a dense substance, in which case it becomes more difficult to move.
The test arm seen above includes several high quality robotic servo motors. You probably know that servo motors have feedback circuits that let them sense their position, and this is what is used to detect when a user moves the arm. This movement is tracked in the virtual 3D environment seen on the screen. In this case, the base of the robot is sitting in a pool of water. When the end of the virtual arm is in open air it’s pretty easy to move. When it dips below the water line the motors are used to increase resistance, simulating movement through a denser substance.
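[Diego]’s control code isn’t published in the write-up, but the water effect amounts to switching the servos from a light damping gain to a heavy one whenever the virtual end effector dips below the waterline, then commanding a torque that opposes the user’s motion. A minimal Python sketch of that idea (the gain values and coordinate convention are illustrative assumptions):

```python
def joint_damping(effector_z, waterline_z=0.0, air_gain=0.05, water_gain=0.8):
    """Choose a velocity-damping gain: heavy below the virtual waterline."""
    return water_gain if effector_z < waterline_z else air_gain

def resist_torque(joint_velocity, gain):
    # The servo pushes against the user's motion, proportional to speed
    return -gain * joint_velocity

# Inside the control loop, each joint would be commanded
# resist_torque(v, joint_damping(z)) every update.
```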
This sounds like a great piece of hardware to have around when the OASIS is finally developed.
Continue reading “Robot arm provides haptic feedback from the virtual world”
While we have seen Kinect-based virtual dressing rooms before, the team at Arbuzz is taking a slightly different approach (Translation) to the digital dress-up game. Rather than using flat images of clothes superimposed on the subject’s body, their solution uses full 3D models of the clothing to achieve the desired effect. This method allows them to create a more true-to-life experience, where the clothing follows the subject around, flowing naturally with the user’s movements.
Like many other Kinect hacks, they use OpenNI and NITE to obtain skeletal data from the sensor. The application itself was written in C# with Microsoft’s XNA game development tools, and uses a special physics engine to render the simulated cloth in a realistic fashion.
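The Arbuzz engine itself is C#/XNA and isn’t shown, but the standard recipe for real-time cloth is Verlet integration over a grid of particles joined by distance constraints. A stripped-down, one-dimensional Python sketch of the two core steps (the particle layout and constants are illustrative, not theirs):

```python
def verlet_step(pos, prev, accel, dt):
    """Advance particles one step; velocity is implicit in (pos - prev)."""
    new = [p + (p - q) + a * dt * dt for p, q, a in zip(pos, prev, accel)]
    return new, pos  # new positions; the old ones become 'previous'

def satisfy_link(p1, p2, rest_len):
    """Nudge two linked particles back toward their rest distance (1D)."""
    delta = p2 - p1
    # half the length error, applied symmetrically to both particles
    correction = (delta - rest_len if delta >= 0 else delta + rest_len) * 0.5
    return p1 + correction, p2 - correction
```

A full cloth sim runs `verlet_step` once per frame, then iterates `satisfy_link` over every neighbor pair a few times, pinning some particles to the tracked skeleton joints.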
[Lukasz] says that the system is still in its infancy, and will require plenty of work before they are completely happy with the results. From where we’re sitting, the demo video embedded below is pretty neat, even if it is a bit rough around the edges. We were particularly pleased to see the Xbox’s native Kinect interface put to work in a DIY project, and we are quite interested to see how things look once they put the final touches on it.
Continue reading “Play dress up with Kinect”
[Denha’s] been building marble machines for years and decided to look back on some of his favorite marble-based builds (translated). There’s a slew of them, as well as some thoughts about each. Our favorite part is the digital simulations of the projects. For instance, the image above shows a flip-flop marble machine that was built in a physics simulator. This makes it a lot easier to plan for the physical build, as it will tell you exact dimensions before you cut your first piece of material. Both of these images were pulled from videos which can be seen after the break. But this isn’t the most hard-core of pre-build planning. SolidWorks, a CAD suite that is most often used to design 3D models for precision machining, has also been used to model the more intricate machines.
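The flip-flop in the physics simulator is the classic marble logic element: each marble that passes through toggles the rocker, and a chain of rockers counts marbles in binary. A quick Python model of that behavior (purely illustrative, not tied to [Denha]’s build):

```python
def drop_marble(flipflops):
    """One marble falls down a chain of flip-flop rockers.
    A rocker that toggles to 'set' catches the marble and stops it;
    a rocker that toggles back to 'reset' passes it to the next stage."""
    for i in range(len(flipflops)):
        flipflops[i] = not flipflops[i]
        if flipflops[i]:
            break

def count(flipflops):
    # Read the chain as a binary number, least significant rocker first
    return sum(1 << i for i, s in enumerate(flipflops) if s)
```

Dropping marbles one at a time makes the chain count 1, 2, 3, … — exactly the ripple-carry behavior the simulated machine shows.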
Continue reading “Marble machines roundup”
[John B] is a software engineer and had some spare time on his hands, so he started messing around with his Kinect, which had been sitting unused for a while. He wanted to see what he could create if he was able to get Kinect data into a virtual environment that supported real-world physics. The first idea that popped into his head was to interface the Kinect with Garry’s Mod.
If you are not familiar with Garry’s Mod, it is a sandbox environment built on top of Valve’s Source engine. The environment supports real-world physics, but beyond that, it pretty much lets you do or build anything you want. [John] found that there was no good way to get Kinect data into the software, so he built his own.
He used OpenNI to gather skeletal coordinate data from the Kinect, which is then passed to some custom code that packages those coordinates inside UDP packets. Those packets are sent to a custom Lua script that is interpreted by Garry’s Mod.
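[John]’s actual bridge is on GitHub; the general shape of such a bridge is small enough to sketch. Here the packet format, joint layout, and port number are illustrative assumptions, not his real protocol:

```python
import socket
import struct

def pack_skeleton(joints):
    """Flatten (joint_id, x, y, z) tuples into one binary UDP payload:
    a joint count, then id + three floats per joint, network byte order."""
    payload = struct.pack("!I", len(joints))
    for joint_id, x, y, z in joints:
        payload += struct.pack("!Ifff", joint_id, x, y, z)
    return payload

def send_skeleton(joints, host="127.0.0.1", port=26000):
    """Fire one skeleton frame at the listening Lua script."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(pack_skeleton(joints), (host, port))
    sock.close()
```

On the receiving side, the Lua script would unpack the same fixed-size records each frame and move the in-game ragdoll’s bones to match.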
The result is just plain awesome, as you can see in the video below. Instead of simply playing some random game with the Kinect, you get to design the entire experience from the ground up. The project is still in its infancy, but it’s pretty certain that we’ll see some cool stuff in short order. All of the code is available on GitHub, so give it a shot and share your videos with us.
Continue reading “Kinect hacked to work with Garry’s Mod means endless hours of virtual fun”