Dr. Niels Olson uses the Augmented Reality Microscope. (Credit: US Department of Defense)

Google’s Augmented Reality Microscope Might Help Diagnose Cancer

Despite recent advances in diagnosing cancer, many cases are still diagnosed by taking biopsies and analyzing thin slices of tissue under a microscope. Properly analyzing these tissue sample slides requires highly experienced and skilled pathologists, and remains subject to some level of bias. In 2018 Google announced a convolutional neural network (CNN) based system, which they call the Augmented Reality Microscope (ARM), that uses deep learning and augmented reality (AR) to assist a pathologist with the diagnosis of a tissue sample. A 2022 study in the Journal of Pathology Informatics by David Jin and colleagues (CNBC article) details how well this system performs in ongoing tests.

For this particular study, the LYmph Node Assistant (LYNA) model was investigated, which as the name suggests targets detecting cancer metastases within lymph node biopsies. The basic ARM setup is described on the Google Health GitHub page, which contains all of the required software except for the models, which are available on request. The ARM system is fitted around an existing medical-grade microscope, with a camera feeding input data to the CNN model and any relevant outputs from the model overlaid on the image that the pathologist is observing (the AR part).
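
Google hasn’t published the models themselves, but the overall loop from camera frame to on-screen overlay is simple enough to sketch. The snippet below is a minimal illustration in Python with OpenCV, assuming a hypothetical segmentation-style model object that returns a per-pixel tumor probability map; the input size, threshold, and function names are placeholders rather than anything from the actual LYNA code.

```python
# Minimal sketch of the ARM-style loop: grab a frame from the microscope
# camera, run it through a classifier, and overlay the result for the viewer.
# The model, its input size, and the threshold are hypothetical placeholders,
# not Google's actual LYNA implementation.
import cv2
import numpy as np

def overlay_predictions(frame, heatmap, threshold=0.5):
    """Draw a green outline around regions the model flags as suspicious."""
    mask = (heatmap > threshold).astype(np.uint8) * 255
    mask = cv2.resize(mask, (frame.shape[1], frame.shape[0]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cv2.drawContours(frame, contours, -1, (0, 255, 0), 2)
    return frame

def run(model, camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        patch = cv2.resize(frame, (299, 299))               # assumed model input size
        heatmap = model.predict(patch[np.newaxis, ...])[0]   # per-pixel tumor probability
        cv2.imshow("AR overlay", overlay_predictions(frame, heatmap))
        if cv2.waitKey(1) == 27:                             # Esc quits
            break
    cap.release()
```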

Although the study authors noted that they saw potential in the technology, as with most CNN-based systems much depends on how well the training data set was annotated. When a grouping of tissue that included cancerous growth was marked too broadly, the model could draw an improper conclusion. This makes a lot of sense when one considers that this system essentially plays ‘cat or bread’, except with cancer.

These gotchas with recognizing legitimate cancer cases are why the study authors see it mostly as a useful tool for a pathologist. One of the authors, Dr. Niels Olson, notes that back when he was stationed at the naval base in Guam, he would have liked to have had a system like ARM: as one of only two pathologists on the island, it would have given him an easy source of a second opinion.

Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna other than it needs to have a label with an image of the Earth attached to it, for antenna tracking purposes. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
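
The bookkeeping behind that heat map is straightforward: each spectrum reading gets stamped with the tracked antenna position and binned into a grid that is later blended over the camera image. Below is a rough Python sketch of that accumulation step; the grid size, dBm range, and normalized coordinates are assumptions for illustration, not details taken from [Manahiyo]’s app.

```python
# Rough sketch of the heat-mapping idea: pair each RSSI reading with the
# tracked antenna position and keep the strongest value seen in each grid
# cell. Positions are normalized (0..1) image coordinates; the real app gets
# them from ARCore and the readings from the tinySA, both faked here.
import numpy as np

GRID = 64
heat = np.full((GRID, GRID), -120.0)   # dBm floor for "no data yet"

def add_sample(x, y, rssi_dbm):
    """x, y in [0, 1]; keep the peak signal strength per cell."""
    i = min(int(y * GRID), GRID - 1)
    j = min(int(x * GRID), GRID - 1)
    heat[i, j] = max(heat[i, j], rssi_dbm)

def to_alpha(floor=-100.0, ceil=-30.0):
    """Map dBm to 0..1 opacity for blending the heat map over the camera image."""
    return np.clip((heat - floor) / (ceil - floor), 0.0, 1.0)

add_sample(0.5, 0.5, -52.0)            # one fake sample in the middle of the frame
print(to_alpha()[32, 32])
```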

And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.

Continue reading “Inspect The RF Realm With Augmented Reality”

A PCB with several points highlighted by a projection system

Augmented Reality Workbench Helps You To Debug Your Boards

No matter how advanced your design skills, the chances are you’ll need to spend some time chasing bugs in your boards after they come back from the assembly house. Testing and debugging a PCB typically involves a lot of cross-checking between the board, the layout and the schematic, which quickly becomes tiresome even for mildly complex designs. To make this task a bit easier, [Ishan Chatterjee] and colleagues at the University of Washington have designed the Augmented Reality Debugging Workbench, or ARDW for short.

The ARDW is a setup consisting of a lab workbench with an antistatic mat, a selection of measurement instruments and a PC. You can simply place your board on the bench, open the schematic and layout in KiCad and start measuring and debugging your design as you normally would, but the real magic happens when you select a new icon in KiCad that exports the schematic and layout to the ARDW system. From that moment, you can select components in your schematic and have them highlighted not only on the layout, but on the physical board in front of you as well. This is perhaps best demonstrated visually, as the team members do in the video embedded below.
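
The KiCad side of that hand-off is easy to picture using the built-in pcbnew scripting interface. The sketch below is only an illustration of the idea, not the actual ARDW plugin: it pulls component references, pad positions, and net names out of a board file and writes them to JSON for an external tool to consume, assuming KiCad 6 or later.

```python
# A hedged sketch of the export step (not the actual ARDW plugin): use KiCad's
# pcbnew scripting API to pull component references, pad positions, and net
# names out of a board file and dump them as JSON for an external tool.
# Assumes KiCad 6 or later, where GetFootprints()/GetPads() are available.
import json
import pcbnew

def export_board(path, out_path="board_export.json"):
    board = pcbnew.LoadBoard(path)
    data = {"components": [], "pads": []}
    for fp in board.GetFootprints():
        pos = fp.GetPosition()
        data["components"].append({
            "ref": fp.GetReference(),
            "x_mm": pcbnew.ToMM(pos.x),
            "y_mm": pcbnew.ToMM(pos.y),
        })
    for pad in board.GetPads():
        pos = pad.GetPosition()
        data["pads"].append({
            "net": pad.GetNetname(),
            "x_mm": pcbnew.ToMM(pos.x),
            "y_mm": pcbnew.ToMM(pos.y),
        })
    with open(out_path, "w") as f:
        json.dump(data, f, indent=2)

# export_board("my_design.kicad_pcb")
```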

The real-life highlighting of components is achieved thanks to a set of cameras that track the motion of everything on the desk as well as a video projector that overlays information on top of the PCB. All of this enables a variety of useful debugging features: for example, there’s an option to highlight pin one on all components, enabling a simple visual check of each component’s orientation. You can select all Do Not Populate (DNP) instances and immediately see if all highlighted pads are empty. If you’re not sure which component you’re looking at, just point at it with your multimeter probe and it’s highlighted on the schematic and layout. You can even place your probes on a net and automatically log the voltage for future reference, thanks to a digital link between the multimeter and the ARDW software.
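
At the heart of that projection is a calibration that maps layout coordinates to projector pixels. A minimal way to express the mapping is a planar homography, sketched below with made-up calibration points; the real ARDW calibration and tracking pipeline is considerably more involved.

```python
# Sketch of the board-to-projector mapping: once the cameras have located the
# board, a planar homography takes layout coordinates (mm) to projector pixels
# so a highlight lands on the physical component. The four corner
# correspondences below are made-up calibration values.
import cv2
import numpy as np

board_mm = np.float32([[0, 0], [100, 0], [100, 80], [0, 80]])                 # board corners in layout
projector_px = np.float32([[310, 205], [905, 212], [898, 690], [302, 684]])   # same corners in projector space
H, _ = cv2.findHomography(board_mm, projector_px)

def layout_to_projector(x_mm, y_mm):
    """Map one layout coordinate to projector pixel coordinates."""
    pt = np.float32([[[x_mm, y_mm]]])
    px = cv2.perspectiveTransform(pt, H)[0, 0]
    return int(px[0]), int(px[1])

print(layout_to_projector(42.5, 17.0))   # where to draw the highlight for a pad at (42.5, 17.0) mm
```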

In addition to designing and building the ARDW, the team also performed a usability study using a group of human test subjects. They especially liked the ability to quickly locate components on crowded boards, but found the on-line measurement system a bit cumbersome due to its limited positional accuracy. Future work will therefore focus on improving the resolution of the projected image and generally making the system more compact and robust. All software is freely available on the project’s GitHub page, and while the current system looks a little complex for hobbyist use, we can already imagine it being a useful tool in production environments.

It’s not even the first time augmented reality has been used for PCB debugging: we saw a somewhat similar system at the 2019 Hackaday Superconference. AR can also come in handy during the design and prototyping phase, as demonstrated by this AR breadboard.

Continue reading “Augmented Reality Workbench Helps You To Debug Your Boards”

Laser Augmented Reality Glasses Show You The Way

Tech companies like Google and Microsoft have been working on augmented reality (AR) wearables that can superimpose images over your field of view, blurring the line between the real and virtual. Unfortunately for those looking to experiment with this technology, the devices released so far have been prohibitively expensive.

While they might not be able to compete with the latest Microsoft HoloLens, these laser AR glasses from [Joel] promise to be far cheaper and much more approachable for hackers. By bouncing a low-power laser off of a piezo-actuated mirror, the hope is that the glasses will be able to project simple vector graphics onto a piece of reflective film usually used for aftermarket automotive heads-up displays (HUDs).

Piezo actuators are used to steer the mirror.

[Joel] has put together a prototype of what the mirror system might look like, but says driving the high-voltage piezo actuators poses some unique challenges. The tentative plan is to generate the vector data with a smartphone application, send it to an ESP32 microcontroller within the glasses, and then push the resulting analog signals through a 100 V DC-DC boost converter to get the mirror moving.
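
While the driver electronics are still an open question, the firmware side of “walk a list of vector points and push them out of two analog channels” can be sketched in MicroPython using the ESP32’s two built-in 8-bit DACs. Everything below is an educated guess at the approach rather than [Joel]’s actual code; in particular the settle time and the point list format are assumptions.

```python
# MicroPython sketch of the drawing loop, not [Joel]'s firmware: walk a list
# of (x, y) vector points and push them out of the ESP32's two 8-bit DACs,
# whose outputs would then feed the high-voltage stage driving the piezo
# mirror. The settle time is a guess.
from machine import DAC, Pin
import time

dac_x = DAC(Pin(25))   # ESP32 DAC channel 1
dac_y = DAC(Pin(26))   # ESP32 DAC channel 2

square = [(60, 60), (200, 60), (200, 200), (60, 200)]   # a small square, as 8-bit DAC codes

def draw(points, settle_us=200, repeats=500):
    """Step the mirror through each point, dwelling settle_us per point."""
    for _ in range(repeats):
        for x, y in points:
            dac_x.write(x)
            dac_y.write(y)
            time.sleep_us(settle_us)

draw(square)
```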

We’ve seen the ESP32 drive a laser galvanometer to play a game of Asteroids, but recreating such a setup in a small enough package to fit onto a pair of glasses would certainly be an impressive accomplishment. Early tests look promising, but clearly [Joel] has quite a bit of work ahead of him. As a finalist for the Rethink Displays challenge of the 2021 Hackaday Prize, we’re looking forward to seeing the project develop over the coming months.

Augmented Reality On The Cheap With ESP32

Augmented reality (AR) technology hasn’t enjoyed the same amount of attention as VR, and seriously lags in terms of open source development and accessibility. Frustrated by this, [Arnaud Atchimon] created CheApR, an open source, low cost AR headset that anyone can build at home and use as a platform for further development.

[Arnaud] was impressed by the Tilt Five AR goggles, but the price of this cutting edge hardware simply put it out of reach of most people. Instead, he designed and built his own around a 3D printed frame, ESP32, cheap LCDs, and lenses from a pair of sunglasses. The electronics are packed horizontally in the top of the frame, with the displays pointed down into a pair of angled mirrors, which reflect the image onto the sunglasses lenses and into the user’s eyes. [Arnaud] tested a number of different lenses and found that a thin lens with a slight curve worked best. The ESP32 doesn’t actually run the main software, it just handles displaying the images on the LCDs. The images are sent from a computer running software written in Processing. Besides just displaying images, the software can also integrate inputs from an MPU6050 IMU and an ESP32 camera module mounted on the goggles. This allows the images to shift perspective as the goggles move, and lets the software recognize faces and AR markers in the environment.
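
In the actual project the head tracking lives in the Processing application, with the goggles mostly forwarding sensor data, but the tilt-to-offset math itself is simple. Here is a hedged MicroPython sketch that reads the MPU6050 directly and turns the measured tilt into a pixel offset for the displayed image; the I2C pins, scale factor, and gain are assumptions, not values from CheApR.

```python
# Hedged MicroPython sketch of the perspective-shift idea: read the MPU6050
# over I2C and turn the measured tilt into a pixel offset for whatever is
# being drawn on the LCDs. Register addresses are the standard MPU6050 ones;
# the I2C pins, scale factor, and gain are assumptions.
import math
import struct
from machine import I2C, Pin

MPU_ADDR = 0x68
i2c = I2C(0, scl=Pin(22), sda=Pin(21))
i2c.writeto_mem(MPU_ADDR, 0x6B, b'\x00')        # wake the MPU6050 from sleep

def read_accel():
    raw = i2c.readfrom_mem(MPU_ADDR, 0x3B, 6)   # ax, ay, az as big-endian int16
    ax, ay, az = struct.unpack(">hhh", raw)
    return ax / 16384.0, ay / 16384.0, az / 16384.0   # default +/-2 g full scale

def overlay_offset(pixels_per_degree=2):
    """Convert tilt angles into an (x, y) shift for the displayed image."""
    ax, ay, az = read_accel()
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return int(roll * pixels_per_degree), int(pitch * pixels_per_degree)

print(overlay_offset())
```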

All the design files and software are available on GitHub, and we’re excited to see where this project goes. We’ve seen another pair of affordable augmented reality glasses that uses a smartphone as a display, but it seems the headset that was used is no longer available.

Augmented Reality Aids In The Fight Against COVID-19

“Know your enemy” is the essence of one of the most famous quotes from [Sun Tzu]’s Art of War, and it’s as true now as it was 2,500 years ago. It also applies far beyond the martial arts, and as the world squares off for battle against COVID-19, it’s especially important to know the enemy: the novel coronavirus now dubbed SARS-CoV-2. And now, augmented reality technology is giving a boost to the search for fatal flaws in the virus that can be exploited to defeat it.

The video below is a fascinating mix of 3D models of viral structures, like the external spike glycoproteins that give coronaviruses their characteristic crown appearance, layered onto live video of [Tom Goddard], a programmer/analyst at the University of California San Francisco. The tool he’s using is called ChimeraX, a molecular visualization program developed by him and his colleagues. He actually refers to this setup as “mixed reality” rather than “augmented reality”, to stress the fact that AR tends to be an experience that only the user can fully appreciate, whereas this system allows him to act as a guide on a virtual tour of the smallest of structures.

Using a depth-sensing camera and a VR headset, [Tom] is able to manipulate 3D models of the SARS virus — we don’t yet have full 3D structure data for the novel coronavirus proteins — to show us exactly how SARS binds to its receptor, angiotensin-converting enzyme-2 (ACE-2), a protein expressed on the cell surfaces of many different tissue types. It’s fascinating to see how the binding domain of the spike reaches out to latch onto ACE-2 to begin the process of invading a cell; it’s also heartening to watch [Tom]’s simulation of how the immune system responds to and blocks that binding.

It looks like ChimeraX and similar AR systems are going to prove to be powerful tools in the fight against not just COVID-19, but in all kinds of infectious diseases. Hats off to [Tom] and his team for making them available to researchers free of charge.

Continue reading “Augmented Reality Aids In The Fight Against COVID-19”

Debugging PCBs With Augmented Reality

Mihir Shah has designed many a PCB in his time. However, when working through the development process, he grew tired of the messy, antiquated methods of communicating design data with his team. Annotating photos is slow and cumbersome, while sending board design files requires everyone to use the same software and be up to speed. Mihir thinks he has a much better solution by the name of InspectAR, an augmented reality platform that lets you see inside the circuit board and beyond, which he demoed during the 2019 Hackaday Superconference.

The InspectAR package makes it easy to visualise signals on the board.

The idea of InspectAR is to use augmented reality to help work with and debug electronics. It’s a powerful suite of tools that enables the live overlay of graphics on a video feed of a circuit board, letting the user quickly and effectively trace signals, identify components, and get an idea of what’s what. Usable with a smartphone or a webcam, it aims to improve collaboration and communication between engineers by giving everyone a tool that can easily show them what’s going on, without requiring everyone involved to run a fully-fledged and expensive electronics design package.

The Supercon talk served to demonstrate some of the capabilities of InspectAR with an Arduino Uno. With a few clicks, different pins and signals can be highlighted on the board as Mihir twirls it between his fingers. Using ground as an example, Mihir first highlights the entire signal. This looks a little messy, with the large ground plane making it difficult to see exactly what’s going on. Using the example of needing a point to attach an oscilloscope probe to, Mihir instead switches to pad-only mode, clearly revealing places where the user can find the signal on bare pads on the PCB. This kind of attention to detail shows the strong usability ethos behind the development of InspectAR, and we can already imagine finding it invaluable when working with unfamiliar boards. There’s also the possibility to highlight different components and display metadata — which should make finding assembly errors a cinch. It could also be useful for quickly bringing up datasheets on relevant chips where necessary.

Obviously, the electronic design space is a fragmented one, with plenty of competing software in the market. Whether you’re an Eagle diehard, Altium fanatic, or a KiCad fan, it’s possible to get things working with InspectAR. Mihir and the team are currently operating out of office space courtesy of Autodesk, who saw the value in the project and have supported its early steps. The software is available free for users to try, with several popular boards available to test. As a party piece for Supercon, our very own Hackaday badge is available if you’d like to give it a spin, along with several Arduino boards, too. We can’t wait to see what comes next, and fully expect to end up using InspectAR ourselves when hacking away at a fresh run of boards!