Shedding New Light On The Voynich Manuscript With Multispectral Imaging

The Voynich Manuscript is a medieval codex written in an unknown alphabet, replete with fantastic illustrations as bizarre as they are esoteric. It has captured interest for hundreds of years, and manuscript expert [Lisa Fagin Davis] has shared interesting results from using multispectral imaging on some pages of this highly unusual document.

We should make it clear up front that the imaging results have not yielded a decryption key (nor a secret map or anything of the sort), but the detailed write-up and freely-downloadable imaging results are fascinating reading for anyone interested in either the manuscript itself or just how multispectral imaging gets applied to rare documents. Modern imaging techniques might get leveraged into things like authenticating sealed packs of Pokémon cards, but that’s not all they can do.

Because multispectral imaging involves things outside our normal perception, the results require careful analysis rather than intuitive interpretation. Here is one example: multispectral imaging may yield faded text visible “between the lines” of other text and invite leaping to conclusions about hidden or erased content. But the faded text could be the result of show-through (content from the opposite side of the page being picked up) or an offset (ink and pigment transferred from the facing page after the book has sat closed for centuries).

[Lisa] provides a highly detailed analysis of specific pages, and explains the kind of historical context and evidence this approach yields. Make some time to give it a read if you’re at all interested; we promise it’s worth your while.

Remembering Virginia Norwood, Mother Of NASA’s Landsat Success

Virginia T. Norwood passed away earlier this year at the age of 96, and NASA’s farewell to this influential pioneer is worth a read. Virginia was a brilliant physicist and engineer, and among her other accomplishments, we have her to thank for the ongoing success of the Landsat program, which continues to this day.

The goal of the program was to image land from space for the purpose of resource management. Landsat 1 launched with a Multispectral Scanner System (MSS) that Norwood designed to fulfill this task. Multispectral imaging was being done from aircraft at the time, but capturing this data from space — not to mention deciding which wavelengths to capture — and getting it back down to Earth required solving a whole lot of new and difficult problems.

Continue reading “Remembering Virginia Norwood, Mother Of NASA’s Landsat Success”

Multispectral Imaging System Built With Raspberry Pi

Multispectral imaging can be a useful tool, revealing all manner of secrets hidden to the human eye. [elad orbach] built a rig to perform such imaging using the humble Raspberry Pi.

The project is built inside a dark box, which keeps outside light from polluting the results. A camera mounted at the top images specimens placed below, with the Pi taking photos under a range of lighting conditions. The build relies on a wide variety of colored LEDs to provide clean, consistent illumination for imaging. The LEDs are all installed on a large aluminium heatsink, and can be switched on and off by the Raspberry Pi to capture images under different illumination settings. A sheath around the camera ensures only light reflected from the specimen reaches the sensor, cutting out bleed from the LEDs themselves.
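
To get a feel for what the capture loop involves, here’s a minimal sketch of switching one LED channel on at a time and grabbing a frame for each. The wavelength labels, GPIO pin numbers, and library choices (gpiozero and picamera2) are our assumptions for illustration, not details from [elad orbach]’s build.

```python
# Minimal sketch: capture one frame per illumination channel.
# Pin assignments and wavelength labels below are hypothetical.
from time import sleep

from gpiozero import LED
from picamera2 import Picamera2

LED_PINS = {            # hypothetical wavelength label -> GPIO pin
    "470nm_blue": 17,
    "530nm_green": 27,
    "660nm_red": 22,
    "850nm_nir": 23,
}

leds = {name: LED(pin) for name, pin in LED_PINS.items()}

camera = Picamera2()
camera.configure(camera.create_still_configuration())
camera.start()
sleep(2)  # let the camera settle inside the dark box

for name, led in leds.items():
    led.on()
    sleep(0.5)          # give the LED output time to stabilize
    camera.capture_file(f"specimen_{name}.png")
    led.off()

camera.stop()
```

Stacking the resulting frames gives each pixel a crude spectral signature that can then be compared between healthy and stressed specimens.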

Multispectral imaging is particularly useful when imaging botanical material. Taking photos under different lights can reveal diseases, nutrient deficiencies, and other abnormalities affecting plants. We’ve even seen it used to investigate paintings. Video after the break.

Continue reading “Multispectral Imaging System Built With Raspberry Pi”


Hackaday Prize 2022: Multispectral Smartphone Camera Reveals Paintings’ Inner Secrets

Multispectral imaging, or photography using wavelengths other than those in ordinary visible light, has various applications ranging from earth observation to forgery detection in art. For example, titanium white and lead white, two pigments used in different historical eras, look identical in visible light but have distinct signatures in the UV range. Similarly, IR imaging can reveal a painting’s inner layers if the pigments used are transparent to IR.

Equipment for such a niche use is naturally quite pricey, so [Sean Billups] decided to transform an older model smartphone into a handheld multispectral camera, which can help him analyze works of art without breaking the bank. It uses the smartphone’s camera together with a filter wheel attachment that enables it to capture different spectral ranges. [Sean] chose to use a Google Pixel 3a, mainly because it’s cheaply available, but also because it has a good image sensor and camera software. Modifying the camera to enable IR and UV imaging turned out to be a bit of a challenge, however.

Image sensors are naturally sensitive to IR and UV, so cameras typically include a filter to block anything but visible light. To remove this filter from the Pixel’s camera [Sean] had to heat the camera module to soften the adhesive, carefully remove the lens, then glue a piece of plastic to the filter and pull it out once the glue had set. Perfecting this process took a bit of trial and error, but once he managed to effect a clear separation between camera and filter it was simply a matter of reattaching the lens, assembling the phone and mounting the filter wheel on its back.

The 3D-printed filter wheel has slots for four different filters, which can enable a variety of IR, UV and polarized-light imaging modes. In the video embedded below [Sean] shows how the IR reflectography mode can help to reveal the underdrawing in an oil painting. The system is designed to be extendable, and [Sean] has already been looking at adding features like IR and UV LEDs, magnifying lenses and even additional sensors like spectrometers.

We’ve seen a handful of multispectral imaging projects before; this drone-mounted system was a contestant for the 2015 Hackaday Prize, while this project contains an excellent primer on UV imaging.

Continue reading “Hackaday Prize 2022: Multispectral Smartphone Camera Reveals Paintings’ Inner Secrets”

Hackaday Prize Entry: Multispectral Imaging Based On LandSat 7

The Landsat series of Earth-observing satellites is one of the most successful space programs in history. Millions of images of the Earth have been captured by Landsat satellites, and those images have been put to use in fields as diverse as agriculture, forestry, cartography, and geology. This is only possible because of the science equipment on these satellites. These cameras capture a half-dozen or so spectral bands in red, green, blue, and a few flavors of infrared to tell farmers when to plant, give governments an idea of where to send resources, and provide scientists the data they need.
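
For a concrete sense of how those bands turn into farming advice, the classic derived product is the Normalized Difference Vegetation Index (NDVI), which compares the red and near-infrared bands: healthy vegetation absorbs red light and reflects strongly in near-infrared. Here’s a minimal sketch; the toy arrays stand in for real co-registered Landsat bands.

```python
# Sketch: NDVI from co-registered red and near-infrared bands.
# Real Landsat scenes arrive as one GeoTIFF per spectral band; the
# tiny arrays below are just stand-ins for illustration.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1.

    Healthy vegetation reflects strongly in near-infrared and absorbs
    red, so values approaching +1 indicate dense, healthy plant cover.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # avoid divide-by-zero

# Toy example: the left column looks like vigorous crops,
# the right column looks like bare soil.
red_band = np.array([[0.08, 0.30], [0.06, 0.28]])
nir_band = np.array([[0.55, 0.33], [0.60, 0.31]])
print(ndvi(red_band, nir_band))
```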

There is a problem with satellite-based observation; you can’t take a picture of the same plot of land every day. Satellites are constrained by Newton, and if you want frequently updated, multispectral images of a plot of land, a UAV is the way to go.

[SouthMade]’s entry for the Hackaday Prize, uSenseCam, does just that. When this open source multispectral camera array is strapped to a UAV, it will be able to take pictures of a plot of land at wavelengths from 400nm to 950nm. Since it’s on a UAV and not hundreds of miles above our heads, the spatial resolution is vastly improved. Where the best Landsat images have a resolution of 15m/pixel, these cameras can get right down to ground level.

Like just about everyone else building an imaging project, the [SouthMade] team is relying on off-the-shelf camera modules designed for cell phones. Right now they’re working on an enclosure that will allow multiple cameras to be ganged together and have custom filters installed.

While the project itself is just a few cameras in a custom enclosure, it does address a pressing issue. We already have UAVs and the autopilots needed to fly them over fields and forests, and we’re even working on the legality of doing so. What we don’t yet have are the sensing tools that would let these flying robots do the useful things we expect of them, and hopefully this project is a step in the right direction.



Hackaday Prize Entry: Multispectral Imaging For A UAV

At least part of the modern agricultural revolution that is now keeping a few billion people from starving to death can be attributed to remote sensing of fields and crops. Images from Landsat and other earth imaging satellites have been used by farmers and anyone interested in agriculture policy for forty years now, and these strange, false-color pictures are an invaluable resource for keeping the world’s population fed.

The temporal resolution of these satellites is poor, however; it may be a few weeks before an area can be imaged a second time. For some uses that might be enough, but for many applications it simply isn’t frequent enough.

For his Hackaday Prize entry (and his university thesis), [David] is working on attaching the same kinds of multispectral imaging payloads found on Earth sensing satellites to a UAV. Putting a remote control plane up in the air is vastly cheaper than launching a satellite, and being able to download pictures from a thumb drive is much quicker than a downlink to an Earth station.

Right now, [David] is working with a Raspberry Pi and a camera module, but this is just experimental hardware. The real challenge is in the code, and for that he’s simulating multispectral imaging using Minecraft. Yes, it’s just a simulation, but it’s an extremely clever use of a video game to stand in for flying over terrain. You can see a video of the result separated into red, green, and blue channels below.
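
The channel separation shown in the video is the straightforward part; a quick sketch of pulling a captured frame apart into its red, green, and blue planes might look like this. Pillow is our library choice and the filenames are made up, so treat it as illustrative rather than [David]’s actual code.

```python
# Sketch: split a rendered frame into red, green, and blue channels.
# Filenames are placeholders.
from PIL import Image

frame = Image.open("minecraft_terrain.png").convert("RGB")
red, green, blue = frame.split()   # one 8-bit grayscale image per channel

for name, channel in zip(("red", "green", "blue"), (red, green, blue)):
    channel.save(f"terrain_{name}.png")
```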



Continue reading “Hackaday Prize Entry: Multispectral Imaging For A UAV”