Head-Up Display Augments Bionic Turtle’s Reality

There’s a harsh truth underlying all robotic research: compared to evolution, we suck at making things move. Nature has a couple billion years of practice making things that can slide, hop, fly, swim and run, so why not leverage those platforms? That’s the idea behind this turtle with a navigation robot strapped to its back.

This reminds us somewhat of an alternative universe sci-fi story by S.M. Stirling called The Sky People. In the story, Venus is teeming with dinosaurs that Terran colonists use as beasts of burden, controlling them with brain implants that stimulate their pleasure centers. While the team led by [Phill Seung-Lee] at the Korea Advanced Institute of Science and Technology (KAIST) isn’t likely to get as much work from the red-eared slider turtle as the colonists in the story got from their bionic dinosaurs, there’s still plenty to learn from a setup like this.

Using what amounts to a head-up display for the turtle, in the form of a strip of LEDs, along with a food dispenser for positive reinforcement, the bionic terrapin is trained to associate food with the flashing LEDs. The LEDs are then used as cues as the turtle navigates between waypoints in a tank. Sadly, the full article is behind a paywall, but the video below gives you a taste of the gripping action.

Looking for something between amphibian and fictional dinosaurs to play mind games with? Why not make your best friend bionic? Continue reading “Head-Up Display Augments Bionic Turtle’s Reality”

The ‘All-Seeing Pi’ Aids Low-Vision Adventurer

Adventure travel can be pretty grueling, what with the exotic locations and potential for disaster that the typical tourist destinations don’t offer. One might find oneself dangling over a cliff for that near-death-experience selfie or ziplining through a rainforest canopy. All this is significantly complicated by being blind, of course, so a tool like this Raspberry Pi low-vision system would be a welcome addition to the nearly-blind adventurer’s well-worn rucksack.

[Dan] has had vision problems since childhood, but one look at his YouTube channel shows that he doesn’t let that slow him down. When [Dan] met [Ben] in Scotland, [Ben] noticed that [Dan] was using his smartphone as a vision aid, looking at the display up close and zooming in to get as much detail as possible from his remaining vision. [Ben] thought he could help, so he whipped up a heads-up display from a Raspberry Pi and a Pi Camera. Mounted to a 3D-printed frame holding a 5″ HDMI display and worn on a GoPro head mount, the camera provides enough detail to help [Dan] navigate, as seen in the video below.
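The software side of a rig like this can be roughed out in surprisingly little code. Here’s a minimal sketch, not [Ben]’s actual code, that uses the Pi’s picamera library to crop into the sensor and push a magnified full-screen preview to the HDMI display; the resolution and zoom factor are illustrative assumptions.

```python
# A minimal digital-zoom vision aid: crop into the Pi Camera's sensor and show
# the magnified feed full screen on the HDMI display. Resolution and zoom are
# illustrative values, not [Ben]'s settings.
from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(800, 480))   # roughly matches a small 5" HDMI panel

# Take the middle quarter of the frame (x, y, w, h as fractions of the full frame);
# the preview scales it back up, giving about 2x magnification.
camera.zoom = (0.25, 0.25, 0.5, 0.5)

camera.start_preview()                     # renders straight to the display
try:
    while True:
        sleep(1)                           # keep the preview running
except KeyboardInterrupt:
    camera.stop_preview()
```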

The rig is a bit unwieldy right now, but as proof of concept (and proof of friendship), it’s a solid start. We think a slimmer profile design might help, in which case [Ben] might want to look into this Google Glass-like display for a multimeter for inspiration on version 2.0.

Continue reading “The ‘All-Seeing Pi’ Aids Low-Vision Adventurer”

Video With Sensor Data Overlay Via Arduino Mega

If you haven’t been paying attention, big wheel trikes are a thing. There are motor-driven versions as well as OG pedal-pushing types. [Flux Axiom] is of the OG (you only get one link, now it’s on you) flavor and has written an Instructable showing how to get some nice-looking on-screen data synced up with the video for a professional-looking finished product, which you can see in the video after the break.

[Flux Axiom] is using an Arduino Mega in his setup along with a cornucopia of sensors, and all of their data is logged to an SD card. All the code used in his setup is available in his GitHub repository. [Flux Axiom] was also nice enough to include the calibration process he used for the sensors, which is part of the same GitHub download.

Sadly, [Flux Axiom] uses freedom-hating software for combining the video and data; RaceRender 3 is his current solution, and he is pleased with the results. Let us know in the comments if you have an open-source solution for combining the video and data that we can offer him as a replacement.
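As a starting point for such a replacement, the same overlay could be roughed out in Python with OpenCV: read the logged values back off the SD card and burn them onto each video frame. This is only a sketch under assumptions; the file names, column names, and one-row-per-frame alignment are hypothetical, not taken from [Flux Axiom]’s logger.

```python
# Burn logged sensor values onto video frames with OpenCV. Assumes "log.csv"
# holds one row per video frame with hypothetical columns "speed_kmh" and
# "lean_deg"; a real log would need timestamp-based alignment instead.
import csv
import cv2

rows = list(csv.DictReader(open("log.csv")))

cap = cv2.VideoCapture("ride.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("ride_overlay.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

frame_no = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    row = rows[min(frame_no, len(rows) - 1)]
    text = "Speed: {} km/h   Lean: {} deg".format(row["speed_kmh"], row["lean_deg"])
    cv2.putText(frame, text, (20, height - 20), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (255, 255, 255), 2)
    out.write(frame)
    frame_no += 1

cap.release()
out.release()
```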

Edit: Corrected spelling of handle.

Continue reading “Video With Sensor Data Overlay Via Arduino Mega”

ASTROGUN Is Like Asteroids On Steroids


As the Jerusalem Mini Maker Faire approached, [Avishay] had to come up with something to build. His final project is something he calls ASTROGUN, a sort of augmented reality game that has the player attempting to blast quickly approaching asteroids before being hit.

It’s definitely reminiscent of the arcade classic Asteroids. The primary difference is that the player has no spaceship and does not move through space. Instead, the player has a first-person view and can rotate 360 degrees and look up and down. The radar screen in the corner gives you a rough idea of where the asteroids are coming from. Then it’s up to you to actually locate them and blast them into oblivion before they destroy you.

The game is built around a Raspberry Pi computer, which acts as the brains of the operation. The Pi interfaces with an MPU-9150 inertial measurement unit (IMU). You commonly see IMUs used in drones to help them keep their orientation. In this case, [Avishay] is using one to track the motion and orientation of the blaster. He claims nine degrees of freedom with this setup.
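To give an idea of what the orientation tracking involves, here is a minimal sketch (not [Avishay]’s code) that reads the MPU-9150’s accelerometer over I2C on the Pi and derives pitch and roll for aiming. The wiring, the ±2 g range, and the lack of gyro/magnetometer fusion are all simplifying assumptions.

```python
# Read the MPU-9150's accelerometer over I2C on a Raspberry Pi and turn it into
# pitch/roll angles. Register addresses follow the MPU-9150 datasheet; the
# wiring, +/-2 g range, and lack of sensor fusion are simplifying assumptions.
import math
from smbus import SMBus

MPU_ADDR = 0x68        # default I2C address of the MPU-9150
PWR_MGMT_1 = 0x6B
ACCEL_XOUT_H = 0x3B

bus = SMBus(1)
bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0)   # wake the device out of sleep

def read_word(reg):
    """Read a signed 16-bit big-endian value from two consecutive registers."""
    high = bus.read_byte_data(MPU_ADDR, reg)
    low = bus.read_byte_data(MPU_ADDR, reg + 1)
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

ax = read_word(ACCEL_XOUT_H) / 16384.0         # raw counts -> g at +/-2 g range
ay = read_word(ACCEL_XOUT_H + 2) / 16384.0
az = read_word(ACCEL_XOUT_H + 4) / 16384.0

pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
roll = math.degrees(math.atan2(ay, az))
print("pitch %.1f deg, roll %.1f deg" % (pitch, roll))
```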

The Pi generates the graphics and sends the output to a small, high-brightness LCD screen. The screen is mounted perpendicular to the player’s view, so it faces “up”. A small piece of beam-splitting glass is mounted above the display at approximately a 45-degree angle. This is a special kind of glass that is partially reflective and partially transparent. The result is that the player sees the real-world background coming through the glass, with the digital graphics overlaid on top of it. It’s similar to some heads-up display technologies.

All of the electronics either fit inside the toy gun or are mounted around it. The display system was attached with a custom-made fiberglass mount. The code appears to be available on GitHub. Be sure to watch the video of the system in action below. Continue reading “ASTROGUN Is Like Asteroids On Steroids”

Controlling The Garmin HUD With Bluetooth


The Garmin HUD is a very neat device, putting all your navigational info, from your ETA and which lane you should be in to the distance to your next turn, right on your windscreen in a heads-up display. The only problem with the Garmin HUD is that it works solely with the official Garmin app, despite being a Bluetooth device. Now someone is finally digging into the Garmin HUD protocol, allowing anyone to control this HUD from a cell phone, tablet, or computer.

Being completely unable to disassemble the Navigon app for the HUD, [gabonator] decided the only thing to do was to open up the device and take a peek at the packets travelling between its microcontroller and Bluetooth module.

[gabonator] expected human-readable ASCII, but the packets he captured with his oscilloscope and decoded manually were nothing of the sort, so he resorted to simply watching the display in operation and matching packets to what it showed in order to understand how the protocol worked. He got it all decoded and managed to get the Sygic navigation app working with the Garmin HUD. You can check out a video of that below.
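Once the protocol is known, driving the HUD is mostly a matter of opening a serial-over-Bluetooth (RFCOMM) link and pushing packets at it. The sketch below shows that plumbing in Python with PyBluez; the address, channel, and payload bytes are placeholders, since the real packet format is exactly the part [gabonator] had to reverse engineer.

```python
# Open a serial-over-Bluetooth (RFCOMM) connection to the HUD and send it a raw
# packet. The MAC address, channel, and payload bytes are placeholders; the
# actual packet format is what had to be reverse engineered.
import bluetooth   # PyBluez

HUD_ADDR = "00:11:22:33:44:55"   # replace with the HUD's Bluetooth MAC address
RFCOMM_CHANNEL = 1               # typical SPP channel; may differ on the HUD

sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
sock.connect((HUD_ADDR, RFCOMM_CHANNEL))

packet = bytes([0x7B, 0x01, 0x02, 0x03])   # hypothetical payload, for illustration only
sock.send(packet)
sock.close()
```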

Thanks [Kevin] for the tip.

Continue reading “Controlling The Garmin HUD With Bluetooth”

Snowboard Goggle HUD Displays Critical Data While Falling Down A Mountain


[Chris] has been hard at work building a heads-up display into some snowboarding goggles. We’re used to seeing the components that went into the project, but the application is unexpected. His own warning, that the display is too close to your face and could cause injury if you were to fall, highlights the impractical nature of the build. But hey, you’ve got to start somewhere when it comes to prototyping. Perhaps the next iteration will be something safe to use.

A set of MyVu glasses was added to the top portion of the goggles, which lets the wearer view the LCD output by looking slightly up. The display is fed by a Raspberry Pi board which connects to a GPS module, all of which is powered by a USB backup battery. In the video after the break you can see that the display shows time of day, speed, altitude, and temperature (although he hasn’t got a temperature sensor hooked up just yet). His bill of materials puts the project cost at about £160, which is just under $250.
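The data side of a rig like this is straightforward: read NMEA sentences from the GPS module over serial and pull out the fields the goggles display. Here’s a minimal sketch along those lines (not [Chris]’s code) using pyserial and pynmea2; the serial port and baud rate are assumptions.

```python
# Read NMEA sentences from a serial GPS module and extract the time, speed, and
# altitude values the HUD shows. Port and baud rate are assumptions.
import serial     # pyserial
import pynmea2

port = serial.Serial("/dev/ttyAMA0", 9600, timeout=1)

time_utc = speed_kmh = altitude_m = None
while True:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line.startswith("$"):
        continue
    try:
        msg = pynmea2.parse(line)
    except pynmea2.ParseError:
        continue                       # skip garbled sentences
    if line.startswith("$GPRMC") and msg.spd_over_grnd is not None:
        time_utc = msg.timestamp
        speed_kmh = float(msg.spd_over_grnd) * 1.852   # knots -> km/h
    elif line.startswith("$GPGGA") and msg.altitude is not None:
        altitude_m = float(msg.altitude)
    if None not in (time_utc, speed_kmh, altitude_m):
        print("{} UTC  {:.1f} km/h  {:.0f} m".format(time_utc, speed_kmh, altitude_m))
```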

Continue reading “Snowboard Goggle HUD Displays Critical Data While Falling Down A Mountain”

Projecting Video Directly Onto The Retina

With the head-mountable, augmented reality Google Glass capturing tons of attention in the press, it was only a matter of time before we saw a DIY retina projector. This isn’t a new build; [Nirav] has been working on it for a few months, but it might just be time for this information to be useful to someone.

A retina projector focuses laser light through beam splitters and concave mirrors to create a raster display on the back of your eye. There’s an incredible amount of research into this field, but not many DIY projects. To make this project a reality, [Nirav] picked up a SHOWWX laser video projector and mounted it in a 3D-printed frame along with a few pieces of optical equipment.

[Nirav]’s build isn’t without its drawbacks, though. The exit pupil, the small window your eye has to sit within to see the image, is only about 1.5 mm wide and much too small to be of any real use. Also, commercial retina projectors have an output of a puny 2 microwatts, while [Nirav]’s laser projector puts out 200 milliwatts, a full five orders of magnitude more. This is more than enough to permanently damage your eye.