Augmented reality (AR) technology hasn’t enjoyed the same amount of attention as VR, and seriously lags in terms of open source development and accessibility. Frustrated by this, [Arnaud Atchimon] created CheApR, an open source, low cost AR headset that anyone can build at home and use as a platform for further development.
[Arnaud] was impressed by the Tilt Five AR goggles, but the price of this cutting-edge hardware simply put it out of reach of most people. Instead, he designed and built his own around a 3D printed frame, an ESP32, cheap LCDs, and lenses from a pair of sunglasses. The electronics are packed horizontally in the top of the frame, with the displays pointed down into a pair of angled mirrors, which reflect the image onto the sunglasses lenses and into the user’s eyes. [Arnaud] tested a number of different lenses and found that a thin lens with a slight curve worked best. The ESP32 doesn’t actually run the main software; it just handles displaying the images on the LCDs. The images are sent from a computer running software written in Processing. Besides just displaying images, the software can also integrate inputs from an MPU6050 IMU and an ESP32 camera module mounted on the goggles. This allows the images to shift perspective as the goggles move, and to recognize faces and AR markers in the environment.
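The perspective-shift part is conceptually simple: the host software reads the goggles’ orientation from the IMU and translates the rendered overlay accordingly. Here’s a rough sketch of the idea (this is not [Arnaud]’s actual code; the function name, field-of-view figures, and the small-angle approximation are all our own assumptions):

```python
def overlay_offset(yaw_deg, pitch_deg, fov_h_deg=40.0, fov_v_deg=30.0,
                   width_px=320, height_px=240):
    """Map head rotation (e.g. from an MPU6050) to a pixel translation
    of the AR overlay, using a small-angle approximation."""
    dx = -yaw_deg / fov_h_deg * width_px      # look right -> overlay slides left
    dy = pitch_deg / fov_v_deg * height_px    # look up -> overlay slides down
    return round(dx), round(dy)

# Turning the head 10 degrees right shifts a 320x240 overlay 80 px left,
# so virtual objects appear to stay put in the world.
print(overlay_offset(10.0, 0.0))  # (-80, 0)
```

A real implementation would apply a full 3D rotation rather than a flat translation, but even this crude version is enough to make an overlay feel anchored to the room.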
In a sign of the times, the Federal Communications Commission has officially signed off on remote testing sessions for amateur radio licensing in the United States. Testing in the US is through the Volunteer Examiner Coordinator program, which allows teams of at least three Volunteer Examiners to set up in-person testing sessions where they proctor amateur radio licensing exams. The VEs take their jobs very seriously and take pride in offering exam sessions on a regular schedule, so when social distancing rules made their usual public testing venues difficult to access, many of them quickly pivoted to remote testing using teleconferencing applications. Here’s hoping that more VEs begin offering remote testing sessions.
Another aspect of life changed by COVID-19 and social distancing rules has been the simple pleasure of a trip to the museum. And for the museums themselves, the lack of visitors can be catastrophic, both in terms of fulfilling their educational and research missions and through the lack of income that results. To keep the flame alive in a fun way, Katrina Bowen from The Centre for Computing History in Cambridge has recreated her museum in loving detail in Animal Crossing: New Horizons. Considering she was limited to what’s available in the game, Katrina did a remarkable job on the virtual museum; we especially like the Megaprocessor wallpaper. She even managed to work in that staple last stop of every museum, the gift shop.
To the surprise of few, “spatial computing” startup Magic Leap has announced that it is laying off half its workforce as it charts a new course. The company, which attracted billions in funding based on its virtual retinal display technology, apparently couldn’t sell enough of its Magic Leap One headsets to pay the bills. The company is pivoting to industrial users, which honestly seems like a better application for its retinal display technology than the consumer or gaming markets.
And finally, as if 2020 hasn’t been weird enough already, the Department of Defense has officially released videos of what it calls “unidentified aerial phenomena.” These videos, taken from the head-up displays of US Navy fighter jets, had previously been obtained by private parties and released to the public. Recorded between 2004 and 2015, the videos appear to show objects that are capable of extremely high-speed flight and tight maneuvers close to the surface of the ocean. We find the timing of the release suspicious, almost as if the videos are intended to serve as a distraction from the disturbing news of the day. We want to believe we’re not alone, but these videos don’t do much to help.
For most of human history, the way to get custom shapes and colors onto one’s retinas was to draw them on a cave wall, or a piece of parchment, or on paper. Later on, we invented electronic displays and used them for everything from televisions to computers, even toying with displays that gave the illusion of a 3D shape existing in front of us. Yet what if one could skip this surface entirely and draw directly onto our retinas?
Admittedly, the thought of aiming lasers directly at the layer of cells at the back of our eyeballs — the delicate organs which allow us to see — likely does not provoke the same response as the thought of sitting in front of a 4K, 27″ gaming display to look at the same content. Yet effectively the same photons would be painting the same image on our retinas. And what if it could be an 8K display, cinema-sized? Or maybe a HUD overlay instead, like in video games?
In many ways, this concept of so-called virtual retinal displays seems almost too much like science fiction, and yet it has been the subject of decades of research, with increasingly sophisticated technologies bringing it closer to everyday reality. Will we be ditching our displays and TVs for this technology any time soon?
“Know your enemy” is the essence of one of the most famous quotes from [Sun Tzu]’s Art of War, and it’s as true now as it was 2,500 years ago. It also applies far beyond the martial arts, and as the world squares off for battle against COVID-19, it’s especially important to know the enemy: the novel coronavirus now dubbed SARS-CoV-2. And now, augmented reality technology is giving a boost to the search for fatal flaws in the virus that could be exploited to defeat it.
The video below is a fascinating mix of 3D models of viral structures, like the external spike glycoproteins that give coronaviruses their characteristic crown appearance, layered onto live video of [Tom Goddard], a programmer/analyst at the University of California San Francisco. The tool he’s using is called ChimeraX, a molecular visualization program developed by him and his colleagues. He actually refers to this setup as “mixed reality” rather than “augmented reality”, to stress the fact that AR tends to be an experience that only the user can fully appreciate, whereas this system allows him to act as a guide on a virtual tour of the smallest of structures.
Using a depth-sensing camera and a VR headset, [Tom] is able to manipulate 3D models of the SARS virus — we don’t yet have full 3D structure data for the novel coronavirus proteins — to show us exactly how SARS binds to its receptor, angiotensin-converting enzyme-2 (ACE-2), a protein expressed on the cell surfaces of many different tissue types. It’s fascinating to see how the binding domain of the spike reaches out to latch onto ACE-2 to begin the process of invading a cell; it’s also heartening to watch [Tom]’s simulation of how the immune system responds to and blocks that binding.
It looks like ChimeraX and similar AR systems are going to prove to be powerful tools in the fight against not just COVID-19, but all kinds of infectious diseases. Hats off to [Tom] and his team for making them available to researchers free of charge.
Thus far, the vast majority of human photographic output has been two-dimensional. 3D displays have come and gone in various forms over the years, but as technology progresses, we’re beginning to see more and more immersive display technologies. Of course, to use these displays requires content, and capturing that content in three dimensions requires special tools and techniques. Kim Pimmel came down to Hackaday Superconference to give us a talk on the current state of the art in advanced AR and VR camera technologies.
Kim has plenty of experience with advanced displays, with an impressive resume in the field. Having worked on Microsoft’s HoloLens, he now leads Adobe’s Aero project, an AR app aimed at creatives. Kim’s journey began at a young age, first experimenting with his family’s Yashica 35mm camera, where he discovered a love for capturing images. Over the years, he experimented with a wide variety of gear, receiving a Canon DSLR from his wife as a gift, and later tinkering with the Stereo Realist 35mm 3D camera. The latter led to Kim’s growing obsession with three-dimensional capture techniques.
Through his work in the field of AR and VR displays, Kim became familiar with the combination of the Ricoh Theta S 360 degree camera and the Oculus Rift headset. This allowed users to essentially sit inside a photo sphere, and see the image around them in three dimensions. While this was compelling, Kim noted that a lot of 360 degree content has issues with framing. There’s no way to guide the observer towards the part of the image you want them to see.
Mihir Shah has designed many a PCB in his time. However, when working through the development process, he grew tired of the messy, antiquated methods of communicating design data with his team. Annotating photos is slow and cumbersome, while sending board design files requires everyone to use the same software and be up to speed. Mihir thinks he has a much better solution in InspectAR, an augmented reality platform that lets you see inside the circuit board and beyond, which he demoed during the 2019 Hackaday Superconference.
The idea of InspectAR is to use augmented reality to help work with and debug electronics. It’s a powerful suite of tools that enable the live overlay of graphics on a video feed of a circuit board, enabling the user to quickly and effectively trace signals, identify components, and get an idea of what’s what. Usable with a smartphone or a webcam, the aim is to improve collaboration and communication between engineers by giving everyone a tool that can easily show them what’s going on, without requiring everyone involved to run a fully-fledged and expensive electronics design package.
The Supercon talk served to demonstrate some of the capabilities of InspectAR with an Arduino Uno. With a few clicks, different pins and signals can be highlighted on the board as Mihir twirls it between his fingers. Using ground as an example, Mihir first highlights the entire signal. This looks a little messy, with the large ground plane making it difficult to see exactly what’s going on. Using an example of needing a point to attach an oscilloscope probe to, Mihir instead switches to pad-only mode, clearly revealing places where the user can find the signal on bare pads on the PCB. This kind of attention to detail shows the strong usability ethos behind the development of InspectAR, and we can already imagine finding it invaluable when working with unfamiliar boards. There’s also the possibility to highlight different components and display metadata — which should make finding assembly errors a cinch. It could also be useful for quickly bringing up datasheets on relevant chips where necessary.
Obviously, the electronic design space is a fragmented one, with plenty of competing software in the market. Whether you’re an Eagle diehard, Altium fanatic, or a KiCad fan, it’s possible to get things working with InspectAR. Mihir and the team are currently operating out of office space courtesy of Autodesk, who saw the value in the project and have supported its early steps. The software is available free for users to try, with several popular boards available to test. As a party piece for Supercon, our very own Hackaday badge is available if you’d like to give it a spin, along with several Arduino boards, too. We can’t wait to see what comes next, and fully expect to end up using InspectAR ourselves when hacking away at a fresh run of boards!
Retroreflectors are interesting materials, known for their ability to reflect light back toward its source. Examples include street signs, bicycle reflectors, and cat’s eyes, which so hauntingly pierce the night. They’re also used in the Tilt Five tabletop AR system, for holographic gaming. [Adam McCombs] got his hands on a Tilt Five gameboard, and threw it under the microscope to see how it works.
[Adam] isn’t mucking around, fielding a focused ion beam microscope for the investigation. This scans a beam of gallium ions across a sample for imaging. With the added kinetic energy of an ion beam versus a more typical electron beam, the sample under the microscope can be ablated as well as imaged. This allows [Adam] to very finely chip away at the surface of the retroreflector to see how it’s made.
The analysis reveals that the retroreflecting spheres are glass, coated in metal. They’re stuck to a surface with an adhesive, which coats the bottom of the spheres, and acts as an etch mask. The metal coating is then removed from the sphere’s surface sticking out above the adhesive layer. This allows light to enter through the transparent part of the sphere, and then bounce off the metal coating back to the source, creating a sheet covered in retroreflectors.
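The glass-bead trick works because a sphere of high enough refractive index focuses incoming light near its own back surface, right where the metal coating sits to bounce it straight back. A quick paraxial check illustrates this (the formula is the standard ball-lens back focal length; the bead radius is arbitrary, and this is our own back-of-the-envelope, not from [Adam]’s analysis):

```python
def ball_lens_bfl(n, radius=1.0):
    """Paraxial back focal length of a glass sphere (ball lens),
    measured from its rear surface: BFL = R(2 - n) / (2(n - 1))."""
    return radius * (2.0 - n) / (2.0 * (n - 1.0))

for n in (1.5, 1.9, 2.0):
    print(f"n = {n}: focus sits {ball_lens_bfl(n):.3f} radii behind the bead")
```

At n = 2 the focus lands exactly on the back surface, which is why retroreflective beads are made from high-index glass rather than ordinary n ≈ 1.5 window glass.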