There are a few very different pathways to building a product, and we have to applaud the developers who take the open-source path. Today’s highlight is [Mentra], who are releasing an open-source smart glasses OS for their own and others’ devices, letting you develop your smart glasses ideas just once, with a single codebase that runs across multiple models.
Currently, the compatibility list covers four models: two of them Mentra’s (the Live and Mach 1), one from Vuzix (the Z100), and one from Even Realities (the G1). Some are display-only, and some are recording-only. The app store already has a few apps that cover the basics, the repository looks lively, and if the project’s openness is anything to go by, we’re sure to see more.
[NullPxl]’s Ban-Rays concept is a wearable that detects when one is in the presence of camera-bearing smartglasses, such as Meta’s line of Ray-Bans. A project in progress, it’s currently focused on how to reliably perform detection without resorting to using a camera itself. Right now, it plays a well-known audio cue whenever it gets a hit.
Once the software is nailed down, the device aims to be small enough to fit into a pair of glasses.
Currently, [NullPxl] is exploring two main methods of detection. The first takes advantage of the fact that the image sensors in cameras act as tiny retroreflectors for IR. That means camera-toting smartglasses have an identifying feature, which can be sensed and measured. You can see an example of such a reflection in the header image above.
As mentioned, Ban-Rays eschews the idea of using a camera to perform this. [NullPxl] understandably feels that putting a camera on glasses in order to detect glasses with cameras doesn’t hold much water, conceptually.
The alternate approach is to project IR at a variety of wavelengths while sensing reflections with a photodiode. Initial tests show that scanning a pair of Meta smartglasses this way does indeed look different from scanning regular eyeglasses, but probably not different enough to be conclusive on its own. That brings us to the second method being used: wireless activity.
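As a rough illustration, multi-wavelength readings like these could be boiled down to a simple baseline-subtraction and ratio test. Everything below — the wavelengths, thresholds, and sample values — is an assumption for the sketch, not data or code from [NullPxl]’s experiments:

```python
# Hypothetical sketch of classifying photodiode readings taken with an IR
# emitter pulsed on and off at two different wavelengths. All numbers here
# are illustrative assumptions, not measurements from the Ban-Rays project.

def reflectance(readings_on, readings_off):
    """Average reading with the emitter on, minus the ambient baseline."""
    baseline = sum(readings_off) / len(readings_off)
    signal = sum(readings_on) / len(readings_on)
    return max(signal - baseline, 0.0)

def looks_like_camera(r_850nm, r_940nm, ratio_threshold=1.5, floor=10.0):
    """An image sensor tends to retroreflect strongly; a plain lens does not.
    Flag a hit only when there is real signal and the band ratio is unusual."""
    if r_850nm < floor and r_940nm < floor:
        return False  # nothing reflective in view at all
    return (r_850nm / max(r_940nm, 1e-9)) > ratio_threshold
```

A strong, lopsided return (say `looks_like_camera(120.0, 40.0)`) would trip the detector, while two weak readings would not; the hard part, as the project notes, is picking thresholds that separate camera glasses from ordinary lenses reliably.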
Characterizing a device by its wireless activity turned out to be trickier than expected. At first, [NullPxl] aimed to simply watch for BLE (Bluetooth Low-Energy) advertisements coming from smartglasses, but these only seem to happen during pairing and power-up, and sometimes when the glasses are removed from the storage case. Clearly a bit more is going to be needed, but since these devices rely heavily on wireless communications there might yet be some way to actively query or otherwise characterize their activity.
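To give a flavour of what passively watching advertisements involves: a BLE advertisement payload is a series of length-type-value structures, and the manufacturer-specific structure (AD type 0xFF) begins with a 16-bit little-endian company identifier. A minimal parser might look like the sketch below; the example payload uses Apple’s well-known company ID (0x004C) purely as sample data, since we won’t guess at what any particular smartglasses actually broadcast:

```python
def parse_ad_structures(payload: bytes):
    """Split a raw BLE advertisement payload into (ad_type, data) pairs.
    Format per the Bluetooth Core spec: [length][type][data...], repeated."""
    out = []
    i = 0
    while i < len(payload):
        length = payload[i]
        if length == 0:  # zero length terminates the payload early
            break
        ad_type = payload[i + 1]
        out.append((ad_type, payload[i + 2 : i + 1 + length]))
        i += 1 + length
    return out

def manufacturer_id(payload: bytes):
    """Return the 16-bit little-endian company identifier from the
    manufacturer-specific data structure (AD type 0xFF), or None."""
    for ad_type, data in parse_ad_structures(payload):
        if ad_type == 0xFF and len(data) >= 2:
            return int.from_bytes(data[:2], "little")
    return None

# Example: flags structure followed by manufacturer data for company 0x004C.
adv = bytes([0x02, 0x01, 0x06, 0x05, 0xFF, 0x4C, 0x00, 0x10, 0x02])
```

The catch, as [NullPxl] found, is that you only get to parse what the device actually sends, and these glasses mostly keep quiet outside of pairing and power-up.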
This kind of project is getting some interest. Here’s another smartglasses detector that seems to depend entirely on sniffing OUIs (Organizationally Unique Identifiers), an approach [NullPxl] suspects isn’t scalable due to address randomization in BLE. Clearly, a reliable approach is still in the works.
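The randomization problem is baked into the address format itself. Per the Bluetooth Core specification, the two most significant bits of a random device address tell you which kind it is, and only public or static addresses carry a stable, OUI-style prefix worth matching against a vendor list. A small sketch of that check:

```python
def random_address_kind(addr: str) -> str:
    """Classify a BLE *random* device address by the two most significant
    bits of its first byte (Bluetooth Core spec, Vol 6, Part B):
    0b11 = static, 0b01 = resolvable private, 0b00 = non-resolvable private.
    Resolvable private addresses change periodically, so an OUI lookup
    against them tells you nothing about the manufacturer."""
    msb = int(addr.split(":")[0], 16)
    return {
        0b11: "static",
        0b01: "resolvable-private",
        0b00: "non-resolvable-private",
    }.get(msb >> 6, "reserved")
```

A detector that sees mostly `resolvable-private` addresses in the wild would explain why pure OUI sniffing doesn’t scale.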
The increasing number of smartglasses raises questions about the impact of normalizing tech companies turning people into always-on recording devices. Of course, the average person is already being quietly recorded by a staggering number of hidden cameras. But at least it’s fairly obvious when an individual is recording you with a personal device like their phone. That may not be the case for much longer.
The Pebble was the smartwatch darling of the early 2010s, a glimpse of the future in the form of a microcontroller and screen strapped to your wrist. It was snapped up by Fitbit and canned, which might have been the end of it all were it not for the dedication of the Pebble community.
Google open-sourced the OS back in January this year, and since then a new set of Pebble products have appeared under the guidance of Pebble creator [Eric Migicovsky]. Now he’s announced the full open-sourcing of the current Pebble hardware and software stack. As he puts it, “Yesterday, Pebble watch software was ~95% open source. Today, it’s 100% open source”.
If you’re curious, it can all be found in repositories under the Core Devices GitHub account. Building your own Pebble clone sounds cool, but perhaps the real value lies in giving the new Pebbles something the original never had: an assured future. If you buy one of the new watches, you’ll know that it will remain fixable, and since you have the full set of files, you can create new parts for it or update its software. We think that’s the right way to keep a personal electronic device relevant.
The geometric waveguide glass of the Meta Ray-Ban Display glasses. (Credit: iFixit)
Recently the avid teardown folk over at iFixit got their paws on Meta’s Ray-Ban Display glasses, for a literal in-depth look at these smart glasses. Along the way they came across the fascinating geometric waveguide technology that makes the floating display feature work so well. There’s also an accompanying video of the entire teardown, for those who enjoy watching a metal box cutter get jammed into plastic.
Overall, these smart glasses can be considered somewhat repairable, as you can pry the arms open with a bit of heat. Inside you’ll find the 960 mWh battery and a handful of PCBs, but finding spare parts for anything beyond perhaps the battery will be a challenge. The front part of the glasses contains the antennae and the special lens on the right side that works with the liquid crystal on silicon (LCoS) projector to reflect the image back to your eye.
While LCoS has been in use for many years, including in Google Glass, it’s the glass that provides the biggest technological advancement here. Instead of the typical diffractive waveguide, it uses a geometric reflective waveguide made by Schott, with the technology developed by Lumus for use in augmented reality (AR) applications. This is supposed to offer better optical efficiency, as well as less light leakage into or out of the waveguide.
Although it’s definitely impressive technology, the overall repairability score of these smart glasses is pretty low, and you have to contend with both looking incredibly dorky and some people considering you to be a bit of a glasshole.
Though threading is an old concept in computer science, and fabric computing has been a term for about thirty years, the terminology has so far been more metaphorical than strictly descriptive. [Cedric Honnet]’s FiberCircuits project, on the other hand, takes a much more literal approach to weaving technology “into the fabric of everyday life,” to borrow the phrase from [Mark Weiser]’s vision of computing which inspired this project. [Cedric] realized that some microcontrollers are small enough to fit into fibers no thicker than a strand of yarn, and used them to design these open-source threads of electronics (open-access paper).
The physical design of the FiberCircuits was inspired by LED filaments: a flexible PCB wrapped in a protective silicone coating, optionally with a protective layer of braiding surrounding it. There are two kinds of fiber: the main fiber and display fibers. The main fiber (1.5 mm wide) holds an STM32 microcontroller, a magnetometer, an accelerometer, and a GPIO pin to interface with external sensors or other fibers. The display fibers are thinner at only one millimeter, and hold an array of addressable LEDs. In testing, the fibers could withstand six newtons of force and be bent ten thousand times without damage; fibers protected by braiding even survived 40 cycles in a washing machine unscathed. [Cedric] notes that finding a PCB manufacturer that will make the thin traces required for this circuit board is a bit difficult, but if you’d like to give it a try, the design files are on GitHub.
[Cedric] also showed off a few interesting applications of the thread, including a cyclist’s beanie with automatic integrated turn signals, a woven fitness tracker, and a glove that senses the wearer’s hand position; we’re sure the community can find many more uses. The fibers could be embroidered onto clothing, or embedded into woven or knitted fabrics. On the programming side, [Cedric] ported support for this specific STM32 core to the Arduino ecosystem, and it’s now maintained upstream by the STM32duino project, which should make integration (metaphorically) seamless.
Aiming for small scale, [James] began with a 6 mm blue phosphor glass tube, which was formed into a reference to Pink Pony Club, one of Chappell Roan’s more popular songs. The glass was then filled with pure neon to a relatively low pressure of just 8 torr. This was an intentional choice to create a more conductive lamp that would be easier to run off a battery supply. The use of pure neon also made the tubes easy to repair in the event they sprang a leak and needed a refill. A Midget Script gas tube power supply is used to drive the tiny tubes from DC power. In testing, the tubes draw just 0.78 amps at 11.8 volts. That’s not a trivial current draw, but for neon it’s pretty good, and you could easily carry a battery pack to run it for an hour or three without issue.
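The battery-pack claim checks out with some quick back-of-the-envelope math. The current and voltage below are the measured figures from the build; the pack capacity and converter efficiency are our own assumptions for the sketch:

```python
def runtime_hours(capacity_wh: float, current_a: float = 0.78,
                  voltage_v: float = 11.8, efficiency: float = 0.85) -> float:
    """Estimated tube runtime from a battery pack.
    0.78 A at 11.8 V are the measured figures; the pack capacity and
    85% conversion efficiency are assumptions, not from the build."""
    power_w = current_a * voltage_v  # roughly 9.2 W at the supply input
    return capacity_wh * efficiency / power_w

# A common 3S, 5 Ah lithium pack holds roughly 11.1 V * 5 Ah = 55.5 Wh,
# which works out to about five hours of glow under these assumptions.
hours = runtime_hours(55.5)
```

Even a much smaller pocketable pack clears the "hour or three" bar comfortably.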
If you’re not a glass blower, fear not—you can always make stuff that has a similar visual effect with some LEDs and creativity. Meanwhile, if you’ve got your own neon creations on the go—perhaps for Halloween?—don’t hesitate to light up the tipsline!
There have been lots of haptic vest devices over the years, though the vast majority have been very simple. Many existing suits pack in a few speakers or vibration motors to give feedback to the wearer. Kinethreads aims to go further, serving as a full-body haptic suit using an innovative mechanical setup.
Kinethreads is effectively an exosuit, which mounts several motorized pulley systems to the wearer’s body. These pulleys are attached to the user’s hands, feet, back, torso, and head via strings. By winding in the pulleys, the device can effectively tug on different parts of the body, creating rich, dynamic physical feedback that can easily be felt and interpreted by the user. The whole system weighs 4.6 kilograms, which is not featherweight, but still practical for a wearable. It can run for six hours on a single charge, and the whole suit can be donned or doffed in under a minute. Cost is stated to be under $500.
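The string-and-spool approach is mechanically simple: each actuator winds string onto a spool, so the travel per motor revolution is just the spool’s circumference. A toy calculation, where the spool diameter is an assumed example rather than a figure from the Kinethreads work:

```python
import math

def string_travel_mm(spool_diameter_mm: float, revolutions: float) -> float:
    """String wound onto a spool over some number of motor revolutions.
    Ignores the small diameter growth as layers of string build up; the
    10 mm spool used in the example is an assumption, not a spec."""
    return math.pi * spool_diameter_mm * revolutions

# Pulling a limb back 60 mm with a hypothetical 10 mm spool takes
# about two motor turns, which a geared motor can do very quickly.
turns_needed = 60 / (math.pi * 10)
```

That quick spool-up is what lets a string-based design deliver sharp, perceptible tugs rather than the gentle buzz of a vibration motor.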