[mi_kotalik] wearing DIY AR goggles in a futuristic setting

Pi Zero To AR: Building DIY Augmented Reality Glasses

If you’re into pushing tech boundaries from home, this one’s for you. Redditor [mi_kotalik] has crafted ‘Zero’, a custom pair of DIY augmented reality (AR) glasses using a Raspberry Pi Zero. Designed as an affordable, self-contained device for displaying simple AR functions, Zero allows him to experiment without breaking the bank. With features like video playback, Bluetooth audio, a teleprompter, and an image viewer, Zero is a testament to what can be done with determination and creativity on a budget. The original Reddit thread includes videos, a build log, and links to documentation on X, giving you an in-depth look into [mi_kotalik]’s journey. Take a sneak peek through the lens here.

Creating Zero wasn’t simple. From designing the frame in Tinkercad to experimenting with transparent PETG to print lenses (before ultimately switching to resin-cast lenses), [mi_kotalik] faced plenty of challenges. By customizing SPI displays and optimizing them to 60 FPS, he achieved an impressive level of real-time responsiveness, allowing him to explore AR interactions like never before. While the Raspberry Pi Zero’s power is limited, [mi_kotalik] is already planning a V2 with a Compute Module 4 to enable 3D rendering, GPS, and spatial tracking.
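
The Reddit thread doesn’t spell out how the 60 FPS figure was reached, but a quick back-of-the-envelope calculation shows why the SPI link is the part worth optimizing. The Python sketch below assumes a typical 320×240, 16-bit-per-pixel panel; the actual resolution of [mi_kotalik]’s displays isn’t stated:

```python
# Rough SPI bandwidth estimate for a small AR display.
# Assumption: 320x240 panel at RGB565 (16 bpp) -- not confirmed by the build log.
width, height = 320, 240
bits_per_pixel = 16
target_fps = 60

bits_per_frame = width * height * bits_per_pixel
required_bps = bits_per_frame * target_fps

print(f"Per frame: {bits_per_frame / 8 / 1024:.0f} KiB")
print(f"Required : {required_bps / 1e6:.1f} Mbit/s of sustained SPI throughput")
# ~73.7 Mbit/s -- far above the Pi's default SPI clock, which is why drivers
# for these panels overclock the SPI bus and only retransmit changed regions.
```

Drivers such as fbcp-ili9341 reach rates like this by overclocking the SPI bus and sending only the pixels that changed between frames; whether [mi_kotalik] used that driver or rolled his own isn’t clear from the thread.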

Zero is an inspiring example for tinkerers hoping to make AR tech more accessible, especially after the fresh news of both Meta and Apple cancelling their attempts to venture into the world of AR. If you are into AR and eager to learn from an original project like this one, check out the full Reddit thread and explore Hackaday’s past coverage on augmented reality experiments.

Continue reading “Pi Zero To AR: Building DIY Augmented Reality Glasses”

Meta Cancels Augmented Reality Headset After Apple Vision Pro Falls Flat

The history of consumer technology is littered with things that came and went. For whatever reason, consumers never really adopted the tech, and it eventually died. Some of those concepts seem to persistently hang on, however, such as augmented reality (AR). Most recently, Apple launched its Vision Pro ‘mixed reality’ headset at an absolutely astounding price to a largely negative response and disappointing sales numbers. This impending market flop now seems to have made Meta (née Facebook) reconsider bringing a similar AR device to market.

To most, this news will come as little of a surprise, considering that Microsoft’s AR product (HoloLens) explicitly seeks out (government) niches with substantial budgets, and Google’s smart glasses have crashed and burned despite multiple market attempts. In a consumer market where virtual reality products are already desperately trying not to become another 3D display debacle, it would seem that amidst a lot of this sci-fi-adjacent ‘cool technology,’ there are a lot of executives and marketing critters who forgo the basic question: ‘why would anyone use this?’

Continue reading “Meta Cancels Augmented Reality Headset After Apple Vision Pro Falls Flat”

Dr. Niels Olson uses the Augmented Reality Microscope. (Credit: US Department of Defense)

Google’s Augmented Reality Microscope Might Help Diagnose Cancer

Despite recent advances in diagnosing cancer, many cases are still diagnosed by taking biopsies and analyzing thin slices of tissue under a microscope. Properly analyzing these tissue sample slides requires highly experienced and skilled pathologists, and remains subject to some level of bias. In 2018, Google announced a convolutional neural network (CNN) based system it calls the Augmented Reality Microscope (ARM), which would use deep learning and augmented reality (AR) to assist a pathologist with the diagnosis of a tissue sample. A 2022 study in the Journal of Pathology Informatics by David Jin and colleagues (CNBC article) details how well this system performs in ongoing tests.

For this particular study, the LYmph Node Assistant (LYNA) model was investigated, which, as the name suggests, targets detecting cancer metastases in lymph node biopsies. The basic ARM setup is described on the Google Health GitHub page, which contains all of the required software, except for the models, which are available on request. The ARM system is fitted around an existing medical-grade microscope, with a camera feeding the CNN model its input data; any relevant outputs from the model are overlaid on the image that the pathologist is observing (the AR part).
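
The Google Health repository contains the full pipeline, but the core loop — grab a frame from the microscope camera, run the model, paint the result back over the live image — is simple to sketch. The Python snippet below is an illustration only: predict_heatmap is a stand-in for the real LYNA model, and the camera index and threshold are assumptions.

```python
import cv2
import numpy as np

def predict_heatmap(frame_rgb: np.ndarray) -> np.ndarray:
    """Stand-in for the LYNA CNN: should return a per-pixel tumour-probability
    map in [0, 1]. The real model is available from Google Health on request;
    here we return zeros so the sketch runs end to end."""
    return np.zeros(frame_rgb.shape[:2], dtype=np.float32)

def overlay_detections(frame_bgr: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Outline the regions the model flags, as ARM overlays them in the eyepiece."""
    heatmap = predict_heatmap(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    mask = (heatmap >= threshold).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    annotated = frame_bgr.copy()
    cv2.drawContours(annotated, contours, -1, (0, 255, 0), 2)  # green outlines
    return annotated

# Typical loop: read the microscope camera, annotate, and show the result.
cap = cv2.VideoCapture(0)           # camera index 0 is an assumption
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("ARM overlay (sketch)", overlay_detections(frame))
    if cv2.waitKey(1) == 27:        # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```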

Although the study authors noted that they saw potential in the technology, as with most CNN-based systems a lot depends on how well the training data set was annotated. When a grouping of tissue including cancerous growth was marked too broadly, this could cause the model to draw an improper conclusion. This makes a lot of sense when one considers that this system essentially plays ‘cat or bread’, except with cancer.

These gotchas with recognizing legitimate cancer cases are why the study authors see it mostly as a useful tool for a pathologist. One of the authors, Dr. Niels Olson, notes that back when he was stationed at the naval base in Guam, he would have liked to have had a system like ARM to give him, as one of only two pathologists on the island, an easy source of a second opinion.

(Heading image: Dr. Niels Olson uses the Augmented Reality Microscope. (Credit: US Department of Defense))

Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna other than that it needs to have a label with an image of the Earth attached to it, for antenna tracking purposes. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
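
The app’s core job — turning a stream of (tracked antenna position, measured power) samples into a picture — is conceptually straightforward. Below is a rough Python sketch of that accumulation step; the coordinate frame, grid size, and sample values are all assumptions, since [Manahiyo] hasn’t published the internals at this level of detail:

```python
import numpy as np

# Accumulate (x, y, dBm) samples into a 2-D signal-strength map.
# In RadioFieldAR the positions come from ARCore tracking the Earth-image
# label on the antenna; here they are plain floats in metres (an assumption).
GRID = 64          # cells per side
EXTENT = 1.0       # metres covered by the grid, centred on the origin

power_sum = np.full((GRID, GRID), np.nan)
counts = np.zeros((GRID, GRID), dtype=int)

def add_sample(x: float, y: float, dbm: float) -> None:
    """Fold one measurement into the running total for its grid cell."""
    i = int((x / EXTENT + 0.5) * GRID)
    j = int((y / EXTENT + 0.5) * GRID)
    if 0 <= i < GRID and 0 <= j < GRID:
        if counts[i, j] == 0:
            power_sum[i, j] = 0.0
        power_sum[i, j] += dbm
        counts[i, j] += 1

def heatmap() -> np.ndarray:
    """Per-cell mean power in dBm, NaN where no sample has landed yet.
    (Averaging dBm directly is a simplification; averaging linear power
    would be more rigorous.)"""
    return power_sum / np.where(counts == 0, 1, counts)

# Example: a few fake samples collected while waving the antenna around.
for x, y, dbm in [(0.05, 0.02, -60.0), (0.06, 0.02, -58.5), (-0.20, 0.10, -75.0)]:
    add_sample(x, y, dbm)
print(heatmap()[33:36, 32:35])
```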

And if you’re feeling déjà vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

A PCB with several points highlighted by a projection system

Augmented Reality Workbench Helps You To Debug Your Boards

No matter how advanced your design skills, the chances are you’ll need to spend some time chasing bugs in your boards after they come back from the assembly house. Testing and debugging a PCB typically involves a lot of cross-checking between the board, the layout and the schematic, which quickly becomes tiresome even for mildly complex designs. To make this task a bit easier, [Ishan Chatterjee] and colleagues at the University of Washington have designed the Augmented Reality Debugging Workbench, or ARDW for short.

The ARDW is a setup consisting of a lab workbench with an antistatic mat, a selection of measurement instruments and a PC. You can simply place your board on the bench, open the schematic and layout in KiCAD and start measuring and debugging your design as you normally would, but the real magic happens when you select a new icon in KiCAD that exports the schematic and layout to the ARDW system. From that moment, you can select components in your schematic and have them highlighted not only on the layout, but on the physical board in front of you as well. This is perhaps best demonstrated visually, as the team members do in the video embedded below.

The real-life highlighting of components is achieved thanks to a set of cameras that track the motion of everything on the desk as well as a video projector that overlays information on top of the PCB. All of this enables a variety of useful debugging features: for example, there’s an option to highlight pin one on all components, enabling a simple visual check of each component’s orientation. You can select all Do Not Populate (DNP) instances and immediately see if all highlighted pads are empty. If you’re not sure which component you’re looking at, just point at it with your multimeter probe and it’s highlighted on the schematic and layout. You can even place your probes on a net and automatically log the voltage for future reference, thanks to a digital link between the multimeter and the ARDW software.
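
Under the hood, a step like this needs a calibration that maps coordinates in the KiCad layout to pixels in the projector’s image. The team’s paper and repository describe their full pipeline; the Python sketch below shows only the basic mapping idea using a planar homography, with the fiducial correspondences invented for illustration:

```python
import numpy as np
import cv2

# Known points: fiducial positions in the KiCad layout (mm) and the projector
# pixels where they land once the board is registered on the bench.
# These correspondences are made up for illustration.
layout_mm = np.array([[0, 0], [80, 0], [80, 50], [0, 50]], dtype=np.float32)
projector_px = np.array([[212, 118], [905, 131], [898, 570], [205, 556]], dtype=np.float32)

# A 3x3 homography suffices because both the board and the projection are planar.
H, _ = cv2.findHomography(layout_mm, projector_px)

def layout_to_projector(points_mm: np.ndarray) -> np.ndarray:
    """Map (N, 2) layout coordinates in mm to projector pixel coordinates."""
    pts = points_mm.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

# Highlight a hypothetical component placed at (32.5, 21.0) mm in the layout.
print(layout_to_projector(np.array([[32.5, 21.0]])))
```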

In addition to designing and building the ARDW, the team also performed a usability study using a group of human test subjects. They especially liked the ability to quickly locate components on crowded boards, but found the on-line measurement system a bit cumbersome due to its limited positional accuracy. Future work will therefore focus on improving the resolution of the projected image and generally making the system more compact and robust. All software is freely available on the project’s GitHub page, and while the current system looks a little complex for hobbyist use, we can already imagine it being a useful tool in production environments.

It’s not even the first time augmented reality has been used for PCB debugging: we saw a somewhat similar system at the 2019 Hackaday Superconference. AR can also come in handy during the design and prototyping phase, as demonstrated by this AR breadboard.

Continue reading “Augmented Reality Workbench Helps You To Debug Your Boards”

Laser Augmented Reality Glasses Show You The Way

Tech companies like Google and Microsoft have been working on augmented reality (AR) wearables that can superimpose images over your field of view, blurring the line between the real and virtual. Unfortunately for those looking to experiment with this technology, the devices released so far have been prohibitively expensive.

While they might not be able to compete with the latest Microsoft HoloLens, these laser AR glasses from [Joel] promise to be far cheaper and much more approachable for hackers. By bouncing a low-power laser off of a piezo-actuated mirror, the hope is that the glasses will be able to project simple vector graphics onto a piece of reflective film usually used for aftermarket automotive heads-up displays (HUDs).

Piezo actuators are used to steer the mirror.

[Joel] has put together a prototype of what the mirror system might look like, but says driving the high-voltage piezo actuators poses some unique challenges. The tentative plan is to generate the vector data with a smartphone application, send it to an ESP32 microcontroller within the glasses, and then push the resulting analog signals through a 100 V DC-DC boost converter to get the mirror moving.
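
[Joel]’s firmware isn’t public yet, so the following is only a guess at what the signal-generation side might look like: a MicroPython sketch that walks the ESP32’s two 8-bit DAC outputs through a list of vector points. The high-voltage amplifier for the piezo actuators (and any laser blanking) would sit downstream of these pins and is not shown.

```python
# MicroPython on ESP32 -- a guessed-at sketch, not [Joel]'s actual firmware.
# GPIO25/GPIO26 carry the ESP32's two 8-bit DAC channels; an external 100 V
# boost/amplifier stage (not shown) would drive the piezo mirror from them.
from machine import DAC, Pin
import time

dac_x = DAC(Pin(25))
dac_y = DAC(Pin(26))

# A square, as (x, y) pairs in DAC counts (0-255).
square = [(60, 60), (200, 60), (200, 200), (60, 200), (60, 60)]

def draw_path(points, steps_per_segment=32, dwell_us=50):
    """Linearly interpolate between vector points so the mirror sweeps smoothly."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for s in range(steps_per_segment):
            t = s / steps_per_segment
            dac_x.write(int(x0 + (x1 - x0) * t))
            dac_y.write(int(y0 + (y1 - y0) * t))
            time.sleep_us(dwell_us)

while True:            # redraw continuously; persistence of vision does the rest
    draw_path(square)
```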

We’ve seen the ESP32 drive a laser galvanometer to play a game of Asteroids, but recreating such a setup in a small enough package to fit onto a pair of glasses would certainly be an impressive accomplishment. Early tests look promising, but clearly [Joel] has quite a bit of work ahead of him. As a finalist for the Rethink Displays challenge of the 2021 Hackaday Prize, we’re looking forward to seeing the project develop over the coming months.

Augmented Reality On The Cheap With ESP32

Augmented reality (AR) technology hasn’t enjoyed the same amount of attention as VR, and seriously lags in terms of open-source development and accessibility. Frustrated by this, [Arnaud Atchimon] created CheApR, an open-source, low-cost AR headset that anyone can build at home and use as a platform for further development.

[Arnaud] was impressed by the Tilt Five AR goggles, but the price of this cutting-edge hardware simply put it out of reach of most people. Instead, he designed and built his own around a 3D printed frame, an ESP32, cheap LCDs, and lenses from a pair of sunglasses. The electronics are packed horizontally into the top of the frame, with the displays pointed down into a pair of angled mirrors, which reflect the image onto the sunglasses lenses and into the user’s eyes. [Arnaud] tested a number of different lenses and found that a thin lens with a slight curve worked best. The ESP32 doesn’t actually run the main software; it just handles displaying the images on the LCDs. The images are sent from a computer running software written in Processing. Besides just displaying images, the software can also integrate inputs from an MPU6050 IMU and an ESP32 camera module mounted on the goggles. This allows the images to shift perspective as the goggles move, and lets the software recognize faces and AR markers in the environment.
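
The original viewer software is written in Processing, but the perspective-shift idea is easy to illustrate in a few lines of Python: take the yaw and pitch reported by the MPU6050 and slide the rendered overlay the opposite way so it appears anchored in the world. The field of view and per-eye resolution below are assumptions, not CheApR’s actual specs:

```python
# Assumed display and optics parameters -- not taken from the CheApR docs.
DISPLAY_W, DISPLAY_H = 240, 240      # pixels per eye
FOV_H, FOV_V = 40.0, 40.0            # degrees covered by the combiner

def overlay_offset(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Pixel offset that keeps an overlay visually pinned to a world direction.

    As the goggles yaw right, the overlay must slide left by the same angular
    amount, converted to pixels through the (assumed) field of view.
    """
    px_per_deg_x = DISPLAY_W / FOV_H
    px_per_deg_y = DISPLAY_H / FOV_V
    dx = int(round(-yaw_deg * px_per_deg_x))
    dy = int(round(pitch_deg * px_per_deg_y))
    return dx, dy

# Example: the wearer turns 5 degrees to the right and tilts 2 degrees down.
print(overlay_offset(5.0, -2.0))   # -> (-30, -12)
```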

All the design files and software are available on GitHub, and we’re excited to see where this project goes. We’ve seen another pair of affordable augmented reality glasses that uses a smartphone as a display, but it seems the headset that was used is no longer available.