[Julie Wang] has created an augmented reality system on a Field Programmable Gate Array (FPGA). Augmented reality is nothing new – heck, these days even your tablet can do it. [Julie] has taken a slightly different approach though. She’s not using a processor at all. Her entire system, from capture to image processing to VGA signal output, is instantiated in an FPGA.
Using the system is as simple as holding up a green square of cardboard. Viewing the world through an old camcorder, [Julie’s] project detects and tracks the green square, then overlays a 3D image of Cornell’s McGraw Tower on top of it. The tower moves with the cardboard, appearing to really be there. [Julie] injected a bit of humor into the project with the option of swapping the tower for an image of her professor, [Bruce Land].
[Julie] started with an NTSC video signal, captured by a DE2-115 board with an Altera Cyclone IV FPGA. Once the signal is inside the FPGA, [Julie’s] logic runs it through a median filter. A color detector then finds an area of green pixels, which is passed to a corner follower and a corner median filter. The tower or Bruce image is loaded from ROM and overlaid on the video stream, which is finally output via VGA.
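The write-up doesn’t include [Julie’s] HDL, but the per-pixel stages are easy to model in software. Here’s a minimal Python sketch of the median filter and green detector, with the threshold, window size, and the bounding-box stand-in for the corner follower all our own assumptions, not details from her design:

```python
def median3x3(channel, x, y):
    """3x3 median filter on one channel of a 2D frame (borders ignored)."""
    window = [channel[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return sorted(window)[4]  # middle of the 9 samples

def is_green(r, g, b, margin=40):
    """Crude green detector: the green channel dominates red and blue."""
    return g > r + margin and g > b + margin

def green_bounding_box(rgb_frame):
    """Bounding box of green pixels, a software stand-in for the corner follower."""
    xs, ys = [], []
    for y, row in enumerate(rgb_frame):
        for x, (r, g, b) in enumerate(row):
            if is_green(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs), max(ys))
```

On the FPGA each of these stages is a pipelined hardware block operating on the live pixel stream, not a loop over a stored frame.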
The amazing part is that there is no microprocessor involved in any of the processing. Logic and state machines control the show. Great work [Julie], we hope [Bruce] gives you an A!
Continue reading “Augmented Reality with an FPGA”
No good at pool? Never fear, Cassapa is here! [Alex Porto] has created an augmented reality system for playing pool, and it means almost anyone can make those cool trick shots!
Ca-what? Cassapa (“caçapa”) is the Portuguese word for a pool table pocket. The system works by mounting a webcam directly above the pool table for image recognition. Dedicated software interprets the image and identifies the positions of the pockets, cushions, balls and cue, which it then uses to calculate the game physics. A projector beams the forecast shot back onto the table, letting you make tiny adjustments, updated in real time, until you’ve lined up the perfect shot.
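The post doesn’t detail Cassapa’s physics model, but the core of a shot forecast is simple geometry. Here’s a minimal sketch assuming an idealized table with perfect cushion reflections and no spin or friction; the real software surely does more:

```python
import math

def forecast(x, y, angle, table_w, table_h, bounces=2):
    """Trace a cue-ball path from (x, y) at `angle` radians, reflecting off
    the cushions up to `bounces` times; returns the cushion hit points."""
    dx, dy = math.cos(angle), math.sin(angle)
    points = []
    for _ in range(bounces):
        # Distance (in time units) to reach each cushion along the current heading.
        tx = (table_w - x) / dx if dx > 0 else (-x / dx if dx < 0 else math.inf)
        ty = (table_h - y) / dy if dy > 0 else (-y / dy if dy < 0 else math.inf)
        t = min(tx, ty)
        x, y = x + dx * t, y + dy * t
        points.append((round(x, 6), round(y, 6)))
        if tx < ty:
            dx = -dx  # bounced off a side cushion
        else:
            dy = -dy  # bounced off an end cushion
    return points
```

Projecting a polyline like this onto the felt, and recomputing it as the cue moves, is what makes the live adjustment possible.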
Unfortunately, having a big projector shining down on your pool table won’t exactly make anyone believe you’re actually good at pool. Although if you could combine this with Google Glass or any other vision augmenting goggles… that would be pretty cool. Well, you’d still be terribly dishonest and a cheater — but anyway, take a look at the video after the break.
Continue reading “Cassapa: Augmented Pool”
[William Steptoe] is a post-doctoral research associate at University College London. This means he gets to play with some really cool hardware. His most recent project is an augmented reality update to the Oculus Rift. This is much more than hacking a pair of cameras on the Rift though. [William] has created an entire AR/VR user interface, complete with dockable web browser screens.

He started with a stock Rift, and a room decked out with a professional motion capture system. The Rift was made wireless with the addition of an ASUS Wavi and a laptop battery system. [William] found that the wireless link added no appreciable latency to the Rift.

To move into the realm of augmented reality, [William] added a pair of Logitech C310 cameras. The C310 lens’ field of view was a bit narrow for what he needed, so lenses from a Genius WideCam F100 were swapped in. The Logitech cameras were stripped down to the board level, and mounted on 3D printed brackets that clip onto the Rift’s display. Shapelock was added to the mounts to allow the convergence of the cameras to be easily set.
Stereo camera calibration is a difficult and processor-intensive process. Add to that multiple tracking systems (both the 6DOF head tracking on the Rift and the video tracker built into the room) and you’ve got quite a computational challenge. [William] found that he needed a Unity shader running on his PC’s graphics card to get the system operating in real-time. The results are quite stunning. We didn’t have a Rift handy to view the 3D portions of [William’s] video, but the sense of presence in the room still showed through. Videos like this make us excited for the future of augmented reality applications, with the Rift, the upcoming castAR, and other systems.
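To get a feel for why stereo vision is so processor-hungry (and why [William] pushed the work onto the GPU), consider the per-pixel search in even the simplest block-matching scheme. This is textbook sum-of-absolute-differences matching, not [William’s] actual pipeline, and the focal length and baseline in the example are made-up numbers:

```python
def disparity(left_row, right_row, x, block=3, max_d=16):
    """Best-match disparity for pixel x on one scanline, by sum of
    absolute differences over a small block -- repeated for every pixel."""
    best_d, best_cost = 0, float("inf")
    for d in range(min(max_d, x - block) + 1):
        cost = sum(abs(left_row[x + k] - right_row[x - d + k])
                   for k in range(-block, block + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

def depth_mm(focal_px, baseline_mm, d):
    """Triangulate: depth = focal length * baseline / disparity."""
    return focal_px * baseline_mm / d if d else float("inf")
```

Run that inner search for every pixel of two 720p streams at 30+ frames per second, on top of fusing two tracking systems, and a shader on the graphics card starts to look less like a luxury and more like a necessity.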
Continue reading “Oculus Rift Goes from Virtual to Augmented Reality”
[Scott] sent in this tantalizing view of what could be the future of breadboarding. His day job is at EquipCodes, where he’s working on augmented reality systems for the industrial sector. Most of EquipCodes’ augmented reality demos involve large electric motors and power transmission systems. When someone suggested a breadboard demo, [Scott] created a simple 555 LED blinker circuit as a proof of concept. The results are stunning. An AR glyph tells the software which circuit it is currently viewing, and the software then shows a layout of the circuit. Each component can be selected to bring up further information.
The system also acts as a tutor for first-time circuit builders, showing them where each component and wire should go. We couldn’t help but think of our old Radio Shack 150-in-1 circuit kit while watching [Scott] assemble the 555 blinker. A breadboard would be a lot more fun than all those old springs! The “virtual” layout can even be overlaid on the real one, so any misplaced components show up before power is turned on (and the magic smoke escapes).
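The video doesn’t show how the software represents circuits, but the misplaced-component check boils down to comparing the board as built against a reference netlist for whichever circuit the glyph identified. A minimal sketch, with an invented netlist format and a few made-up 555 blinker connections for illustration:

```python
# Reference netlist for the glyph-identified circuit: (component, pin) -> node.
# These connections are illustrative, not EquipCodes' actual data.
REFERENCE_555_BLINKER = {
    ("NE555", "pin3"): "node_led",
    ("NE555", "pin8"): "rail_vcc",
    ("NE555", "pin1"): "rail_gnd",
    ("LED", "anode"): "node_led",
}

def misplaced(reference, as_built):
    """Return the pins whose as-built connection differs from the reference."""
    errors = []
    for pin, node in reference.items():
        if as_built.get(pin) != node:
            errors.append(pin)
    return sorted(errors)
```

The vision side would populate `as_built` from the camera image; anything this check flags gets highlighted in the overlay before power is applied.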
Now we realize this is just a technology demonstrator. Any circuit to be built would have to exist in the software’s database; simple editing software like Fritzing could be helpful here. We’re also not sure how easy it would be to work with a tablet between you and your circuit. A pair of castAR glasses would definitely come in handy. Even so, we’re excited by this video and hope that some of this augmented reality technology makes its way into our hands.
Continue reading “Augmented Reality Breadboarding”
[Jeri, Rick and the Technical Illusions crew] have taken castAR to Kickstarter. We’ve covered castAR a couple of times in the past, but the Kickstarter includes a few new features just ripe for hacking. First, castAR is no longer confined to a retro-reflective surface. In fact, it’s no longer confined to augmented reality at all: an optional clip-on adapter converts castAR into a “free” augmented reality or a full virtual reality system.
[Jeri] has also posted a video on her YouTube channel detailing the entire saga of castAR’s development (embedded after the jump). The video has a real “heart to heart” feel to it, and is definitely worth watching. The story starts with the early days (and late nights) [Rick] and [Jeri] spent at Valve. She goes through the split with Valve and how the two set up a lab in [Rick’s] living room. [Jeri] also outlines some of the technical aspects of the system. She explains how the optics have been reduced from several pounds of projectors to the mere ounces we see today.
Another surprise addition is the campaign’s lower reward tiers, which offer the castAR tracking system on its own. The campaign page says the tracking system can be mounted on anything from robots to other VR headsets, so the possibilities for hacking are almost endless. We’re curious about setting up our own swarm of quadcopters similar to the UPenn GRASP Lab’s. The RFID tracking grid is also offered as a separate option. In the gaming system this will be used for tracking tabletop game pieces, though based upon the Kickstarter page, it sounds as if the grid will use not only RFID but also a camera-based tracking system. We’re definitely curious what possibilities this will hold.
As of this writing, the castAR Kickstarter campaign is already well past the halfway mark on its way to a $400,000 USD goal.
Continue reading “CastAR Goes Live on Kickstarter”
One of our tipsters pointed us to a very cool project by a British university team — it’s called Eidos, and it’s a real-time audio and visual augmentation system.
The creators embarked on this design journey after wondering if there was a way they could control and tune their senses. Imagine Superman and his ability to pick out one voice out of a thousand — with this technology, it could be possible.
The clunky white goggles shown in the image above are the concept behind the visual augmentation. The effect is akin to long-exposure photography, except rendered in real-time as fluid video. We’re not sure how this could help anyone, but we have to admit it would be pretty cool to play around with. Maybe if Google Glass ever comes out, someone could write an app for it to mimic this!
The second device can target your hearing to a specific person in a noisy environment, tuning out all the unnecessary distractions. This could be very helpful for people suffering from attention deficit disorders, although we imagine it would be very strange to get used to. Can you imagine blocking out everything except one person’s face and voice?
Unfortunately there is not much information about the actual tech or software behind these devices, or whether they in fact work, but the concept was so interesting we just had to share it. Stick around after the break to see a video explanation and demonstration of the proposed technology.
Continue reading “Eidos: Audio/Visual Sensory Augmentation”
If there was one sentence heard over and over at Maker Faire NY, it was “Did you see castAR yet?” The Technical Illusions team was at Maker Faire in full force. [Jeri Ellsworth], [Rick Johnson], and team brought two demos: the tried and true Jenga simulator, and a newer overhead shooter based on the Unity 3D engine. We didn’t see any earth-shattering changes from previous demos of castAR, as [Jeri] has moved on to optimizing the hardware, and [Rick] to even more immersive software demos. Optimization and preparing for market are considered the “hard yards” of any product design: this is the place where a huge amount of work goes in, but the changes are subtle to the layperson.
In addition to her development of castAR’s ASIC, [Jeri] has been hard at work on the optics. The “old” glasses used a solid plastic optical path; the newer glasses use a hollow path for the twin 720p projectors, making them even lighter than the previous generation. The light weight of the castAR glasses can’t be overstated: they feel incredibly light, with no perceptible pressure on the nose or ears when wearing them. Also missing was the motion sickness people often experience with VR. This is because castAR doesn’t replace the user’s field of vision; it only augments it. Peripheral motion cues are still there, which makes for a much more comfortable experience. Continue reading “castAR comes to Maker Faire NY 2013”