There are many in the hacker community who would love to experiment with augmented reality (AR), but the hardware landscape isn’t exactly overflowing with options that align with our goals and priorities. Commercial offerings, from Google’s Glass to the Microsoft HoloLens and Magic Leap 2, are largely targeting medical and aerospace customers, and have price tags to match. On the hobbyist side of the budgetary spectrum we’re left with various headsets that let you slot in a standard smartphone, but like their virtual reality (VR) counterparts, they can hardly compare with purpose-built gear.
But there’s hope — Brilliant Labs are working on AR devices that tick all of our boxes: affordable, easy to interface with, and best of all, developed to be as open as possible from the start. Admittedly their first product, Monocle, is somewhat simplistic compared to what the Big Players are offering. But for our money, we’d much rather have something that’s built to be hacked and experimented with. What good are all the latest features and capabilities when you can’t even get your hands on the official SDK?
This week we invited Brilliant Labs’ Head of Engineering Raj Nakaraja to the Hack Chat to talk about AR, Monocle, and the future of open source in this space that’s dominated by proprietary hardware and software.
In the world of hardware hacking, you sometimes spend a ridiculous amount of time debugging a problem, only to find a simple solution that was right in front of you the whole time. [Zack Freedman] got a good dose of this while building the Optigon V2, a modified Epson Moverio wearable display he uses as a teleprompter in all his videos. He prefers having the teleprompter over his left eye only, but the newer version of the Moverio would shut off both sides if one is disconnected, so [Zack] needed a workaround.
Looking for some help from above, [Zack] requested developer documentation for the display module from Epson, but was declined because he wasn’t a manufacturer or product developer. Luckily, a spec sheet available for download from the Epson website did contain a lot of the information he needed. An STM32 monitored the temperature of each display module over a pair of independent I2C interfaces, and would shut down everything if it couldn’t connect to either one. This led [Zack] to attempt to spoof the I2C signals with an ATmega328, but it couldn’t keep up with the 400 kHz I2C bus.
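A rough timing budget shows why the ATmega328 is a tight fit here. This sketch assumes the common 16 MHz ATmega328 clock, which isn’t stated in the article, just to put the 400 kHz bus speed in perspective:

```python
# Back-of-the-envelope: CPU cycles available per I2C clock period
# when trying to keep up with a Fast-mode (400 kHz) bus.
cpu_hz = 16_000_000   # typical ATmega328 clock speed (assumption)
scl_hz = 400_000      # Fast-mode I2C, as mentioned in the article

cycles_per_scl = cpu_hz // scl_hz
print(cycles_per_scl)  # 40 CPU cycles per SCL period
```

Forty cycles per clock edge pair leaves very little headroom once you account for interrupt entry/exit and the work of sampling the bus, shifting data, and generating an ACK, which is consistent with [Zack]’s finding that software spoofing couldn’t keep up.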
However, looking at the logs from his logic analyzer, [Zack] found that the STM32 never talked to both display modules simultaneously, even though it is capable of doing so. Both displays use the same I2C address, so [Zack] could simply connect the two I2C buses to each other with a simple interface board, effectively making the left display “spoof” the signals from the right display.
Whether it was rays from the Sun that made a 150 million kilometer trip just to ruin your day or somebody’s unreasonably bright aftermarket headlights, at some point or another we’ve all experienced the discomfort of bright spots in our eyes. But short of wearing welder’s goggles all the time, what can we do? Luckily for us, [Nick Bild] has come up with a solution. Sort of.
By adding LCDs to a pair of standard sunglasses, [Nick] has created something he’s calling “Light Brakes”. The idea is that the LCDs, having their backings removed, can essentially be used as programmable shutters to block out a specific part of the image that’s passing through them. With the addition of a Raspberry Pi and a camera, the Light Brakes can identify an unusually bright source of light and block it from the wearer’s vision by drawing a sufficiently large blob on the LCDs.
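The bright-spot detection step could be sketched roughly like this; the thresholds, padding, and frame shape here are illustrative assumptions, not values from [Nick]’s build:

```python
import numpy as np

def find_bright_blob(frame, threshold=240, pad=4):
    """Return a boolean mask covering all pixels at or above `threshold`,
    padded by `pad` pixels so the LCD blob fully covers the glare."""
    mask = frame >= threshold
    if not mask.any():
        return mask  # nothing bright enough to block
    ys, xs = np.nonzero(mask)
    # Bounding box of the bright region, expanded and clamped to the frame
    y0, y1 = max(ys.min() - pad, 0), min(ys.max() + pad + 1, frame.shape[0])
    x0, x1 = max(xs.min() - pad, 0), min(xs.max() + pad + 1, frame.shape[1])
    blob = np.zeros_like(mask)
    blob[y0:y1, x0:x1] = True
    return blob

# A mostly dark camera frame with one bright spot
frame = np.full((48, 64), 50, dtype=np.uint8)
frame[20:24, 30:34] = 255
blob = find_bright_blob(frame)
```

In a real build the resulting mask would still need to be warped from camera coordinates to LCD coordinates, since the camera and the wearer’s eye don’t share a viewpoint — likely part of why tracking a moving light source is so hard.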
At least, that’s the idea. As you can see in the video after the break, the LCDs ability to block out a moving source of light is somewhat debatable. It’s also unclear what, if any, effect the “blocking” would have on UV, so you definitely shouldn’t try looking at the sun with a pair of these.
That said, a refined version of the concept could have some very interesting applications. For instance, imagine a pair of glasses that could actively block out advertisements or other unpleasant images from your field of vision. If this all sounds a bit like something out of an episode of Black Mirror, that’s because it is.
A University of Utah team has a working prototype of a new twist on fluid-filled lenses for correction of vision problems: automatic adjustment and refocus depending on what you’re looking at. Technically, the glasses have a distance sensor embedded into the front of the frame and continually adjust the focus of the lenses. An 8 gram, 110 mAh battery powers the prototype for roughly 6 hours.
Eyeglasses that can adapt on the fly to different focal needs are important because many people with degraded vision suffer from more than one condition at the same time, which makes addressing their vision problems more complex than a single corrective lens. For example, many people who are nearsighted or farsighted (where far objects and near objects, respectively, are seen out of focus) also suffer from a general, age-related loss of the eye’s ability to change focus. As a result, people require multiple sets of eyeglasses for different conditions. Bifocal, trifocal, or progressive lenses are really just multiple sets of lenses squashed into a smaller form factor, and greatly reduce the wearer’s field of view, which is itself a significant vision impairment. A full field of view could be restored if eyeglass lenses were able to adapt to different needs based on object distance, and that is what this project achieves.
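The core optics are simple enough to sketch. For an eye that has lost its ability to accommodate, the extra lens power needed to focus an object at distance d is roughly 1/d diopters — which is what a distance sensor plus adjustable lens can provide. This is a generic textbook approximation, not the Utah team’s actual control law, and the cap below is an illustrative assumption:

```python
def required_add_power(distance_m, max_add=3.0):
    """Approximate extra lens power (in diopters) needed to focus an
    object at distance_m for an eye with no remaining accommodation.
    Power = 1 / distance; capped at max_add, a typical reading-add ceiling."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return min(1.0 / distance_m, max_add)

# An object at a typical reading distance of 0.4 m needs about +2.5 D,
# while something 10 m away needs almost no adjustment (+0.1 D).
print(required_add_power(0.4))   # 2.5
print(required_add_power(10.0))  # 0.1
```

The wearer’s baseline prescription would simply be added on top of this distance-dependent term, which is why a single adaptive lens can replace several fixed pairs.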