Mustachioed Nintendo Virtual Boy Gone Augmented Reality

Some people just want to watch the world burn. Others want to spread peace, joy, and mustaches. [Joe Grand] falls into the latter group this time around. His latest creation is Mustache Mayhem, a hack, video game, and art project all rolled into one. This is a bit of a change from deconstructing circuit boards or designing electronic badges, but not completely new for [Joe], who wrote SCSIcide and Ultra SCSIcide for the Atari 2600 back in the early 2000s.

Mustache Mayhem is built into a Nintendo Virtual Boy housing. The Virtual Boy itself was broken, and unfortunately beyond repair. [Joe] removed most of the stock electronics and added a BeagleBone Black, a Logitech C920 webcam, an LCD screen, and some custom electronics. He kept the original audio amplifier, speakers, and controller connector. Angstrom Linux boots into [Joe's] software, which uses OpenCV to detect faces and overlay mustaches. Gameplay is simple: point the console at one or more faces. If you see a mustache, press the A button on the controller! The more faces and mustaches on-screen at once, the more points, or "mojo," the player gets. The code is up on GitHub, and can be built with Xcode targeting the Mac, or directly on the BeagleBone Black.
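
For the curious, the core detect-and-overlay loop takes only a few lines of OpenCV. Here's a minimal sketch of the idea (not [Joe's] actual code; the cascade file path and the crude rectangle standing in for a mustache sprite are our own placeholders):

```cpp
// Minimal sketch of the detect-and-overlay idea behind Mustache Mayhem,
// not [Joe's] actual code. Assumes OpenCV and a standard Haar cascade file.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cam(0);                            // e.g. the Logitech C920
    cv::CascadeClassifier faces;
    faces.load("haarcascade_frontalface_default.xml");  // ships with OpenCV

    cv::Mat frame, gray;
    while (cam.read(frame)) {
        std::vector<cv::Rect> found;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        faces.detectMultiScale(gray, found, 1.1, 3);

        for (const auto& f : found) {
            // Draw a crude "mustache" bar across the lower third of each face;
            // the real game would blit a proper mustache sprite instead.
            cv::Rect stache(f.x + f.width / 4, f.y + 2 * f.height / 3,
                            f.width / 2, f.height / 8);
            cv::rectangle(frame, stache, cv::Scalar(0, 0, 0), cv::FILLED);
        }
        cv::imshow("mayhem", frame);
        if (cv::waitKey(1) == 27) break;                // Esc to quit
    }
    return 0;
}
```

The scoring side of the game, checking A button presses against the current detections, would sit on top of a loop like this.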

[Joe’s] goal for the project was to make a ridiculous game that looks like it could have come out in the 90’s. He also used Mustache Mayhem as a fun way to learn some new skills which will come in handy for more serious projects in the future.

We caught up with [Joe] for a quick interview about his new creation.

How did you come up with the idea for Mustache Mayhem?

I was selling a bunch of my video game collection at PRGE (Portland Retro Gaming Expo) a few years ago and had a broken Virtual Boy that no one bought. A friend of mine was at the table and said I had to do something with it. I thought “People wear cosplay and walk around at conventions, so what if I could do something with the Virtual Boy that you could walk around with?” That was the seed.

A few months later, Texas Instruments sent me the original production release of the BeagleBone Black (rev. A5A). Eighteen months after that I actually started the project. The catalyst was to do something for an upcoming Portland, OR art show (Byte Me 4.0), which is an annual event that shows off interactive technology-based artwork. I wrote up a little description and got accepted. I had less than 2 months to actually get things working and it ended up taking about a month of full-time work. It was much more work than I expected for such a silly project. I originally was going to do something along the lines of walking around in a Doom-like perspective and shooting people when their faces were detected.

That would be pretty darn cool. How did you get from Doom to mustaches?

I saw a TI BeagleBoard demo called “boothstache” which drew mustaches on faces and tweeted the pictures. I thought that doing something non-violent with mustaches would be more suitable (and funny) to actually show my kids. I also secretly wanted to use this project as a way to experiment with Linux, write some code, and learn about face detection and image processing with OpenCV, which I plan to use for some actual computer security research in the future. Mustache Mayhem turned out to be a super cool project and I’m really happy with it. I sort of feel guilty spending so much time on it, since it’s basically just a one-off prototype, but I just got so obsessed with making it exactly as I wanted.

You mentioned on your website that Mustache Mayhem was “designed to challenge the paradigms of personal privacy and entertainment.” What exactly did you mean there?


CES: Meetups, Augmented Reality, And Robots

Hackaday started off Thursday of the Consumer Electronics Show with an impromptu breakfast meetup. This turns out to be a wonderful thing, as it lets you ease into a 16-hour day of standing, walking, talking, and getting lost trying to find your way from conference hall to conference hall. We had a great turnout, and many brought their hacks and demos to show off. A big thanks to the Sambalatte staff, who are awesome people and top-tier baristas.

CastAR


Before leaving for CES I was talking to [Ben Krasnow] about what we should try to see, and he suggested looking for private showings given in hotel suites at the conference. It turns out our friends at Technical Illusions are doing just that: [Jeri] and [Rick] were showing off CastAR in a suite during the week and were nice enough to make room in their booked schedule for a private demo.

What you see above are the guts of the version they are currently shipping as part of their Kickstarter fulfillment. I also got a look at a rev2 prototype and will write a follow-up post with more information on the whole experience when I have more time.

Eureka, Startups!

There is a loop of aisles in the Sands that hosts startup booths, and most of the interesting things I saw on Wednesday and Thursday are there. Here we have a jamming-gripper robot arm, designed for things like moving oddly shaped goods on a manufacturing line. Empire Robotics hit a home run with their booth demo, a take on beer pong: robot versus human. The scoreboard showed the robot winning by an order of magnitude.

[Todd] was at the Tinkerines booth showing off 3D printers aimed at augmenting the STEM curriculum. We couldn't help but notice his TIE fighter ring and inquired about it. He modeled the design himself, sent it off to be cast in silver, and inlaid the stone when the ring came back from the casting service. Sweet!

LVBots

[Sarah Petkus] clued me in and gave me a ride to the Pololu CES open house. The night coincided with the LVBots meetup, which Pololu supports by providing space for the meetings. There were lots of cool robots being shown off; what you see here was just the pre-meeting warmup of line-followers and sumo robots. I shot some video of the show-and-tell, which we'll post once we've had a chance to edit the content.

Closing out CES

Today is the last day of the conference. I stopped by the Voltera PCB printer booth yesterday, but they were nowhere to be found. Turns out they were being handed a $50k check by TechCrunch for winning the Hardware Battlefield. I suppose we'll give them a pass for not being at the table during that!

I’ll be headed over this afternoon to catch up with them. I’m also hoping to get a look at the Voxel8 printer. If you have any other “can’t-miss” suggestions let me know in the comments and I’ll try to add them to my CES dance card.

Augmented Reality Pinball

Pinball machines are fascinating pieces of mechanical and electrical engineering, and now [Yair Moshe] and his students at the Israel Institute of Technology have taken the classic game one step further. Using computer vision and a projector, this group of engineers has created an augmented reality pinball game that takes pinball to a whole new level.

Once the laptop, webcam, and projector are set up, a course is drawn on a whiteboard, which the computer "sees" to determine the rules of the game. Any course you can imagine can be drawn on the whiteboard, with rules that no regular pinball machine could take advantage of. Most notably, the ball can change size when it hits certain types of objects, which makes for a very interesting and unconventional style of play.
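
As a rough idea of how the "see the course" step could work, here's a hedged OpenCV sketch that treats dark marker strokes on the white board as playfield walls. The team's actual rule extraction is surely more involved:

```cpp
// Hedged sketch of the "read the course off the whiteboard" step:
// dark marker strokes on a white board become contours the game can
// treat as walls. The project's real rule detection is not published.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<std::vector<cv::Point>> findCourseWalls(const cv::Mat& frame) {
    cv::Mat gray, ink;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    // Marker strokes are darker than the whiteboard, so invert-threshold.
    cv::threshold(gray, ink, 100, 255, cv::THRESH_BINARY_INV);

    std::vector<std::vector<cv::Point>> walls;
    cv::findContours(ink, walls, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    return walls;  // each contour becomes a collision boundary for the ball
}
```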

The player uses their hands to control the flippers as well, but not with buttons: the computer watches the position of the player's hands and flips the flippers when it sees a hand in the right position. [Yair] and his students recently showed this project off at DLD Tel Aviv and even got [Shimon Peres], former President of Israel, to play some pinball at the conference!

Open Source Marker Recognition For Augmented Reality


[Bharath] recently uploaded the source code for an OpenCV-based pattern recognition platform that can be used for augmented reality, or even robots. It was built in C++ and uses the OpenCV library to recognize marker patterns within a single frame.

The program starts by focusing on one object at a time. This approach avoids building additional arrays containing information about every blob in the image, which could cause problems.

Although this implementation does not track marker information across multiple frames, it provides a nice foundation for integrating pattern recognition into computer systems. The tutorial is straightforward and easy to read. The entire program and source code can be found on GitHub under a CC0 license, so anyone can use it. A video of the program is after the break.
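
The source itself isn't reproduced in the post, but a classic single-frame marker pipeline looks something like the following generic reconstruction (not [Bharath]'s exact code; corner ordering is glossed over):

```cpp
// Sketch of a classic single-frame marker pipeline like the one described:
// threshold, find a quadrilateral, unwarp it, then match the inner pattern.
// This is a generic reconstruction, not [Bharath]'s exact implementation.
#include <opencv2/opencv.hpp>
#include <vector>

cv::Mat extractMarker(const cv::Mat& frame) {
    cv::Mat gray, bin;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, bin, 0, 255, cv::THRESH_BINARY_INV | cv::THRESH_OTSU);

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    for (const auto& c : contours) {
        std::vector<cv::Point> quad;
        cv::approxPolyDP(c, quad, 0.03 * cv::arcLength(c, true), true);
        if (quad.size() != 4 || cv::contourArea(quad) < 1000.0) continue;

        // Focus on one candidate at a time, as the write-up describes,
        // and unwarp it to a canonical square for pattern matching.
        std::vector<cv::Point2f> src(quad.begin(), quad.end());
        std::vector<cv::Point2f> dst = {{0, 0}, {100, 0}, {100, 100}, {0, 100}};
        cv::Mat H = cv::getPerspectiveTransform(src, dst);
        cv::Mat marker;
        cv::warpPerspective(gray, marker, H, cv::Size(100, 100));
        return marker;  // compare this against the known marker patterns
    }
    return cv::Mat();   // no marker found in this frame
}
```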


Augmented Reality With An FPGA

[Julie Wang] has created an augmented reality system on a Field Programmable Gate Array (FPGA). Augmented reality is nothing new; heck, these days even your tablet can do it. [Julie] has taken a slightly different approach, though: she's not using a processor at all. Her entire system, from capture, to image processing, to VGA signal output, is instantiated in an FPGA.

Using the system is as simple as holding up a green square of cardboard. Viewing the world through an old camcorder, [Julie's] project detects and tracks the green square, then adds a 3D image of Cornell's McGraw Tower on top of it. The tower moves with the cardboard, appearing as if it were really there. [Julie] injected a bit of humor into the project: the tower can be swapped for an image of her professor, [Bruce Land].

[Julie] started with an NTSC video signal, captured by a DE2-115 board with an Altera Cyclone IV FPGA. Once the signal is inside the FPGA, [Julie's] logic performs a median filter. A color detector finds an area of green pixels, which is passed to a corner follower and a corner median filter. The tower or Bruce images are loaded from ROM and overlaid on the video stream, which is then output via VGA.
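
For readers more at home in software, the color-detector stage boils down to a few comparators per pixel. A rough software model of that one stage (the real implementation is FPGA logic, and these thresholds are guesses):

```cpp
// Software model of the green-pixel classification stage described above;
// the actual design is pure FPGA logic, and these thresholds are guesses.
#include <cstdint>

bool isTrackedGreen(uint8_t r, uint8_t g, uint8_t b) {
    // A pixel counts as "the green square" when green clearly dominates
    // both red and blue; in hardware this is just comparators, no CPU.
    return g > 100 && g > r + 40 && g > b + 40;
}
```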

The amazing part is that there is no microprocessor involved in any of the processing. Logic and state machines control the show. Great work [Julie], we hope [Bruce] gives you an A!


Oculus Rift Goes From Virtual To Augmented Reality

[William Steptoe] is a post-doctoral research associate at University College London. This means he gets to play with some really cool hardware. His most recent project is an augmented reality update to the Oculus Rift. This is much more than hacking a pair of cameras onto the Rift, though: [William] has created an entire AR/VR user interface, complete with dockable web browser screens.

He started with a stock Rift and a room decked out with a professional motion capture system. The Rift was made wireless with the addition of an ASUS Wavi and a laptop battery system; [William] found that the wireless link added no appreciable latency. To move into the realm of augmented reality, he added a pair of Logitech C310 cameras. The C310 lens's field of view was a bit narrow for what he needed, so lenses from a Genius WideCam F100 were swapped in. The Logitech cameras were stripped down to the board level and mounted on 3D-printed brackets that clip onto the Rift's display. Shapelock was added to the mounts to allow the convergence of the cameras to be easily set.

Stereo camera calibration is a difficult and processor-intensive process. Add to that multiple tracking systems (both the 6DOF head tracking on the Rift and the video tracker built into the room) and you've got quite a computational challenge. [William] found that he needed a Unity shader running on his PC's graphics card to get the system operating in real time. The results are quite stunning. We didn't have a Rift handy to view the 3D portions of [William's] video, but the sense of presence in the room still showed through. Videos like this make us excited for the future of augmented reality applications, with the Rift, the upcoming CastAR, and other systems.
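
To give a sense of what's involved, here's a hedged sketch of the standard OpenCV chessboard workflow for stereo calibration. Board dimensions, file names, and flags are placeholders, and [William's] actual pipeline may well differ:

```cpp
// Hedged sketch of stereo calibration with OpenCV's chessboard workflow.
// Board size, square size, image names, and flags are all placeholders;
// this is not [William's] actual pipeline.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    const cv::Size board(9, 6);      // inner chessboard corners (assumed)
    const float square = 0.025f;     // square edge in meters (assumed)

    // Reference 3D corner positions on the flat board (Z = 0 plane).
    std::vector<cv::Point3f> corners3d;
    for (int y = 0; y < board.height; ++y)
        for (int x = 0; x < board.width; ++x)
            corners3d.emplace_back(x * square, y * square, 0.0f);

    std::vector<std::vector<cv::Point3f>> obj;
    std::vector<std::vector<cv::Point2f>> left, right;

    for (int i = 0; i < 20; ++i) {   // 20 captured pairs (placeholder)
        cv::Mat imgL = cv::imread(cv::format("left%02d.png", i), cv::IMREAD_GRAYSCALE);
        cv::Mat imgR = cv::imread(cv::format("right%02d.png", i), cv::IMREAD_GRAYSCALE);
        if (imgL.empty() || imgR.empty()) continue;
        std::vector<cv::Point2f> cl, cr;
        if (cv::findChessboardCorners(imgL, board, cl) &&
            cv::findChessboardCorners(imgR, board, cr)) {
            obj.push_back(corners3d);
            left.push_back(cl);
            right.push_back(cr);
        }
    }

    // With flags = 0, OpenCV estimates both cameras' intrinsics as well as
    // the rotation R and translation T between the two cameras.
    cv::Mat K1, D1, K2, D2, R, T, E, F;
    double rms = cv::stereoCalibrate(obj, left, right, K1, D1, K2, D2,
                                     cv::Size(640, 480), R, T, E, F, 0);
    std::printf("stereo RMS reprojection error: %.3f px\n", rms);
    return 0;
}
```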


Augmented Reality Breadboarding

[Scott] sent in this tantalizing view of what could be the future of breadboarding. His day job is at EquipCodes, where he's working on augmented reality systems for the industrial sector. Most of EquipCodes' augmented reality demos involve large electric motors and power transmission systems. When someone suggested a breadboard demo, [Scott] created a simple 555 LED blinker circuit as a proof of concept. The results are stunning. An AR glyph tells the software which circuit it is currently viewing. The software then shows a layout of the circuit, and each component can be selected to bring up further information.
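
In software terms, that glyph-to-circuit step amounts to a lookup from a recognized marker ID to a stored circuit description. A hypothetical sketch (every name and value below is invented for illustration):

```cpp
// Hypothetical sketch of the glyph-to-circuit lookup described above:
// a recognized marker ID selects a stored circuit, whose parts the AR
// overlay can then highlight. All names and data here are invented.
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Part {
    std::string refdes;         // e.g. "R1", "C1", "U1"
    std::string value;          // e.g. "10k", "10uF", "NE555"
    std::string breadboardPos;  // where the overlay should draw it
};

int main() {
    // Circuits must already exist in the database, as noted below.
    std::map<int, std::vector<Part>> circuits = {
        {7, {{"U1", "NE555", "e10-e13"},
             {"R1", "10k",   "b5-b9"},
             {"C1", "10uF",  "a15-gnd"}}},  // marker 7: the 555 blinker
    };

    int markerId = 7;  // would come from the AR glyph recognizer
    for (const Part& p : circuits[markerId])
        std::cout << p.refdes << " (" << p.value << ") -> "
                  << p.breadboardPos << "\n";
    return 0;
}
```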

The system also acts as a tutor for first-time circuit builders, showing them where each component and wire should go. We couldn't help but think of our old Radio Shack 150-in-1 circuit kit while watching [Scott] assemble the 555 blinker. A breadboard would be a lot more fun than all those old springs! The "virtual" layout can even be overlaid on the real one, so any misplaced components show up before power is turned on (and the magic smoke escapes).

Now, we realize this is just a technology demonstrator. Any circuit to be built would have to exist in the software's database; simple editing software like Fritzing could be helpful here. We're also not sure how easy it would be to work with a tablet between you and your circuit. A pair of CastAR glasses would definitely come in handy. Even so, we're excited by this video and hope that some of this augmented reality technology makes its way into our hands.
