We’re going to go out on a limb here and say that wherever you are now, a quick glance around will probably reveal at least one LED. They’re everywhere – we can spot a quick half dozen from our desk, mostly acting as pilot lights and room lighting. In those contexts, LEDs are pretty mundane. But what if a little more flash could be added to the LEDs of the world – literally?
That’s the idea behind LightAnchors, which bills itself as a “spatially-anchored augmented reality interface.” LightAnchors comes from work at [Chris Harrison]’s lab at Carnegie Mellon University which seeks new ways to interface with computers, and leverages the ubiquity of LED point sources and the high-speed cameras on today’s smartphones. LightAnchors are basically beacons of digitally encoded data that a smartphone can sense and decode. The target LED is modulated using amplitude-shift keying and each packet contains a data payload and parity bits along with a pre- and post-amble sequence. Software on the phone uses the camera to isolate the point source, track it, and pull the data out of it, which is used to create an overlay on the scene. The video below shows a number of applications, ranging from displaying guest login credentials through the pilot lights on a router to modulating the headlights of a rideshare vehicle so the next fare can find the right car.
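The packet framing described above is simple enough to sketch without hardware. Here's a quick Python mock-up of an ASK packet encoder; note that the preamble, postamble, payload width, and parity scheme below are illustrative guesses, not the exact values from the paper (the real framing is spelled out there and in the demo Arduino code):

```python
# Hypothetical LightAnchors-style framing: preamble + payload + parity + postamble.
# The specific bit patterns and lengths here are assumptions for illustration.

PREAMBLE  = [1, 0, 1, 0]   # assumed sync pattern
POSTAMBLE = [0, 1, 1, 0]   # assumed end marker

def parity_bit(bits):
    """Even parity over the payload bits."""
    return sum(bits) % 2

def encode_packet(byte):
    """Turn one data byte into the on/off (ASK) symbol stream for the LED."""
    payload = [(byte >> i) & 1 for i in range(7, -1, -1)]  # MSB first
    return PREAMBLE + payload + [parity_bit(payload)] + POSTAMBLE

print(encode_packet(0xA5))  # 17 on/off symbols to blink out on the LED
```

On a microcontroller, each symbol would simply drive the LED pin high or low for one symbol period.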
An academic paper (PDF link) goes into greater depth on the protocol, and demo Arduino code for creating LightAnchors is thoughtfully provided. It strikes us that the two main hurdles to adoption of LightAnchors would be convincing device manufacturers to support them, and advertising the fact that what looks like a pilot light might actually be something more, but the idea sure beats fixed markers for AR tracking.
I love the simplicity of the concept,
however… I’m also worried that if this ever catches on, things get “displays” (the blinking LED) that can only be read with the right equipment (a modern smartphone with the right app).
This could mean that technology has advanced another step… first the physical buttons were replaced by awkward (sometimes unresponsive) touch panel interfaces. And now displays themselves could become obsolete, because the smartphone now acts as the display.
Basically it’s a lot like IrDA, but in only one direction. And this is all made possible by the huge advances in phone camera technology (high res, high frame rate). Funny how things go. The video shows visible light, but why bother? Make it IR. Using existing lighting is a nice idea, but having normal lights blinking all the time could eventually get annoying, all to save a simple IR LED or to increase range (to find the correct car). Can’t wait until someone figures out that the phone has an LED too and that it can flash back to the device it’s looking at; now it’s bi-directional.
Still… a nice concept. Wonder where it goes from here.
>I love the simplicity of the concept,
As opposed to a sticker/display with the price on a piece of equipment? Simplicity indeed!
That sticker can’t change pricing based on demand, serve advertising or track you.
“Can’t wait until someone figures out that the phone has an LED too and that it can flash back to the device it’s looking at; now it’s bi-directional.”
The only LED in that direction is the Flash, and I’m not sure it’s up to that.
I’d be willing to bet that most phones modulate the perceived light intensity of the flash by pulse width modulation, in which case they’re very much up to that.
Plus, the device capturing the phone’s LED (flash) probably won’t have a high-speed camera.
Wait, so you’re telling me I can pop up any message on a smartphone display just by having a modulated LED in view…
sounds like fun…
The idea itself sounds good. The abuse/fun you could have with it sounds good too. I can really really really see not wanting to run an app on my phone that will take a modulated source of light and do something on my phone on the basis of it…
Sounds like another method for bad actors to exfiltrate data from a target machine. I understand the use case here, but it could be misused.
That vulnerability may have already existed for some machines, when an LED was triggered by a data pin. I believe PC keyboard lights can be controlled also, not sure what data rate you’d manage through them.
However, if this sort of thing were going to catch on, QR codes would already be more ubiquitous, because they can carry a small text payload, not only URLs… plus it’s immediately obvious that you should show one to your camera. How many LEDs are you going to point your phone at just in case? You’ll get jaded by the one-in-50 hit rate and never bother again, until they have large notices right beside them telling you what they’ll tell you, and at that point you may as well just have printed the short message on the notice.
https://arxiv.org/pdf/1907.05851.pdf
This study was able to accomplish around 3 kilobits/s (per LED used) with an actual IR sensor, and > 120bit/s with a decent smartphone camera (and using all three LEDs).
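Those numbers are roughly what you’d expect if the camera captures one on/off symbol per frame per LED and framing overhead eats part of the raw rate. A back-of-envelope in Python, where the frame rate and packet layout are assumptions for illustration, not figures from the study:

```python
# Camera-limited ASK throughput, rough estimate. All numbers are assumptions.
fps = 120                    # assumed slow-motion capture rate on the phone
leds = 3                     # parallel LED channels, as in the linked study
raw_symbols = fps * leds     # one on/off symbol per frame per LED

payload, packet = 8, 17      # assumed framing: 8 data bits in a 17-bit packet
goodput = raw_symbols * payload / packet

print(raw_symbols, round(goodput))  # 360 raw symbols/s, ~169 bit/s before losses
```

Tracking errors, dropped frames, and retransmissions would pull the real figure down from there, which is consistent with the reported >120 bit/s.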
Yep, unless it goes the direction of ‘always on / monitoring’, but between being obnoxious and providing options for bad actors, that also seems like a no-go. Short of a purpose-specific AR app, it doesn’t seem to have a lot of good real use cases. Kind of like the thing a while back about encoding WiFi passwords into audio playing at a location. Interesting concept, but sorry…
Just like everything else, I guess. I think they wanted to go for something that is “like” a QR code but doesn’t visually disturb you, plus it’s dynamic and can change. Would be cool to see it integrated into the camera like a QR code reader and have it show a small notification. Then I guess it’s the user’s choice to see it or not.
It is very interesting. I experimented with social tags, both visible and IR, for camera sensing. Once the ID was received, Bluetooth and/or a server could be used to query more about the person. Still too slow and inconsistent to do more than a prototype back then. Their POC does seem to be an improvement.
Reminds me of a spy hack from the ’80s. GTE had just introduced business phones that used LEDs instead of incandescent bulbs in the line selector buttons. By rewiring the phone to modulate the LED with the audio, you could listen to the phone with a photosensor in line of sight of the LED, which could even be looking in through a window with a telescope.
LED throwies are about to get a lot more interesting!
reminds me of Li-Fi and HDD led side-channels
Panasonic has been doing this for years – they call it LinkRay, and they’re using the LED backlight of LCD displays.
https://panasonic.net/cns/LinkRay/
A rather small list of verified hardware.
There were some databank watches back in the day that could be updated by screen blinking.
Sometimes it surprises me how a group of 5 researchers can do what a Fortune 500 company did!
Seems to me that this might just end up being the next QR code. Kinda nifty in concept, but gets used for nothing but ads, so nobody’s interested.
I’ve seen QR used as part of authentication.
Remember Bokode? https://hackaday.com/2009/08/29/bokode-a-new-barcode/
I do, actually! Super cool, but resolution limited.
I think the big promise out of that one is that the image looks different depending on where you’re looking from. If you knew where the beacons were, you could probably triangulate based on it.
This also looks super cool. I’m always surprised that more information isn’t translated over modulated light. LiFi, etc.
The two things here that are hacker friendly are the beacon-finding algorithm and the fact that cameras are so cheap. If you don’t mind a low bitrate, you could probably do this with one of the cheapo camera + ESP32 boards.
I’ll write it up if anyone undertakes it!
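To give a feel for the beacon-finding half of the problem, here's a minimal Python sketch: frame-difference two grayscale frames (nested lists standing in for camera buffers) and take the brightest changed pixel as the candidate point source. This is far cruder than the tracking described in the paper, but it's the kind of thing an ESP32-camera port could start from:

```python
# Crude point-source detector: the pixel with the largest on/off difference
# between two frames is the candidate LED beacon. Real implementations would
# threshold, filter, and track the candidate across many frames.

def find_beacon(frame_on, frame_off):
    """Return (row, col) of the pixel with the largest on/off difference."""
    best, best_pos = -1, None
    for r, (row_on, row_off) in enumerate(zip(frame_on, frame_off)):
        for c, (a, b) in enumerate(zip(row_on, row_off)):
            diff = abs(a - b)
            if diff > best:
                best, best_pos = diff, (r, c)
    return best_pos

# Example: a 3x3 "image" where the LED at row 1, column 2 blinks
off = [[10, 10, 10], [10, 10, 12], [10, 10, 10]]
on  = [[10, 11, 10], [10, 10, 200], [10, 10, 10]]
print(find_beacon(on, off))  # -> (1, 2)
```

Once the point source is locked, sampling that one pixel per frame gives you the symbol stream to decode.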
If you have a rear-view or lane-change camera, you’ve probably figured out that newer driving lights and even headlights are LED, and modulated (the PWM brightness control flickers on the camera display). This could be used to communicate between modern cars… for whatever reason. Primarily, I hope, so I can tell other drivers to f-off.