
Perfecting The Pulse Oximeter

We’re always looking for interesting biohacks here on Hackaday, and this new research article describing a calibrated pulse oximeter for different skin tones really caught our attention.

Pulse oximeters are handy little instruments that measure your blood oxygen saturation using photoplethysmography (PPG), and they’re a topic we’re no strangers to here at Hackaday. Since PPG is an optical technique, it stands to reason that its accuracy could be significantly affected by skin tone, something that has become a major topic of discussion in the medical field recently. Given those noted accuracy issues, these researchers endeavored to create a better pulse oximeter by quantifying skin pigmentation and using that data to offset errors in the pulse oximeter measurements. A slick idea, but we think their results leave a lot to be desired.

Diagram showing pulse oximeter and color sensor combining to measure oxygen in blood and skin tone

Their idea sounds straightforward enough. They created their own hardware to measure blood oxygen saturation: a smartwatch that includes red and infrared (IR) light-emitting diodes (LEDs) to illuminate the tissue just below the surface of the skin, and a photosensor for measuring the amount of light that reflects back. But in addition to the standard pulse oximeter hardware, they also include a TCS34725 color sensor to quantify the user’s skin tone.
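We couldn’t find the researchers’ firmware, but reading a TCS34725 from Python is easy enough to sketch. This assumes Adafruit’s CircuitPython driver, and the red-to-green ratio here is just a stand-in pigmentation index for illustration, not the paper’s actual metric:

```python
import board
import busio
import adafruit_tcs34725  # Adafruit's CircuitPython TCS34725 driver

i2c = busio.I2C(board.SCL, board.SDA)
sensor = adafruit_tcs34725.TCS34725(i2c)
sensor.integration_time = 154   # ms; longer integration means less noise
sensor.gain = 16                # valid gains: 1, 4, 16, 60

r, g, b = sensor.color_rgb_bytes          # 8-bit RGB reading off the skin
# Crude stand-in for a pigmentation index -- NOT the paper's metric.
pigment_index = r / max(g, 1)
print(f"RGB: {r},{g},{b}  pigment index: {pigment_index:.2f}")
```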

So what’s the issue? Well, the researchers mention calibrating their color sensor against a standard commercially-available dermatology instrument, just to make sure their skin pigmentation values match a gold standard, but we can’t find that data, making it a bit hard to evaluate how accurate their color sensor actually is. That’s pretty crucial to their entire premise. And ultimately, their corrected blood oxygen values don’t seem terribly promising either. For one individual, they reduced the error from 5.44% to 0.82%, which seems great! But for another user, the error actually increased from 0.99% to 6.41%. Not so great. Is the problem in their color sensor calibration? Could be.
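As for the correction itself, the paper’s exact model isn’t spelled out in the article, but the general idea of “quantify pigmentation, then subtract the predicted error” can be sketched as a simple linear fit over calibration data. All of the numbers and variable names below are invented for illustration:

```python
import numpy as np

# Calibration data (invented numbers): pigmentation index per subject,
# raw SpO2 from the PPG sensor, and reference SpO2 from a gold standard.
pigment  = np.array([1.1, 1.4, 1.9, 2.3, 2.8])
raw_spo2 = np.array([97.5, 96.8, 95.0, 93.9, 92.1])
ref_spo2 = np.array([97.2, 97.0, 96.4, 96.1, 95.8])

# Fit error = a * pigment + b, then subtract the predicted error.
a, b = np.polyfit(pigment, raw_spo2 - ref_spo2, 1)

def corrected(spo2_raw, pigment_index):
    """Apply the fitted pigmentation-dependent offset to a raw reading."""
    return spo2_raw - (a * pigment_index + b)

print(corrected(93.0, 2.5))
```

A real calibration would need far more subjects and a validated pigmentation scale, which is exactly where the researchers’ missing color-sensor comparison data stings.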

We know from personal experience that pulse oximeters are hard, so we applaud their efforts in tackling a major problem. Maybe the Hackaday community could help them out?

Teardown: BlackBerry Smart Card Reader

Years before Steve Jobs showed off the first iPhone, the BlackBerry was already the must-have accessory for mobile professionals. Back then, nobody was worried about watching movies or playing the latest games on their mobile devices, they just wanted a secure and fast way to send and receive email on the go. For that, the BlackBerry was king.

Fast forward to today, and the company is just a shell of what it once was. They don’t even bother making their own hardware anymore. Over the last several years they’ve opted to partner with a series of increasingly obscure manufacturers to produce a handful of lackluster Android phones so they still have something to sell to their dwindling userbase. Anyone excited about the new 5G BlackBerry being built by Texas start-up OnwardMobility? Did you even know it was in the works before now?

A DoD Common Access Card

But this article isn’t about BlackBerry phones. It’s about something that’s even more irrelevant to consumers: the BlackBerry Smart Card Reader. Technically, this little device isn’t dependent on the phones of the same name, but it makes sense that Research In Motion (which eventually just renamed itself to BlackBerry Limited) would market the gadget under the brand of their most popular product. Though as you might expect, software was available to allow it to work with the BlackBerry phone that you almost certainly owned if you needed a dedicated smart card reader.

For those who might not be aware, a smart card in this context is a two-factor authentication token contained in an ID card. These are used extensively by organizations such as the Department of Defense, where they’re known as Common Access Cards, that require you to insert your ID card into a reader before you can log into a secure computer system. This sleek device was marketed as a portable reader that could connect to computers over USB or Bluetooth. Worn around your neck with the included lanyard, the battery-powered reader allowed the card itself to remain on the user’s body while still being readable by nearby devices.
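If you want to poke at a smart card yourself, any PC/SC-compatible reader (this BlackBerry unit included, over USB) can be driven from Python with the pyscard library. Here’s a minimal sketch that lists attached readers and sends a SELECT APDU; the AID bytes below are a placeholder, not a real application identifier, and a card must be in the reader:

```python
from smartcard.System import readers  # pip install pyscard
from smartcard.util import toHexString

available = readers()
print("Readers:", available)

conn = available[0].createConnection()
conn.connect()                            # raises if no card is inserted
print("ATR:", toHexString(conn.getATR()))

# SELECT by AID (CLA INS P1 P2 Lc, then the AID). Placeholder AID!
SELECT = [0x00, 0xA4, 0x04, 0x00, 0x05, 0xA0, 0x00, 0x00, 0x00, 0x01]
data, sw1, sw2 = conn.transmit(SELECT)
print("Status: %02X %02X" % (sw1, sw2))   # 90 00 means success
```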

Civilians will recognize the basic technology from modern “Chip and PIN” debit and credit cards, but we’ve never had to stick one of those into our laptop just to log in. To be sure, the BlackBerry Smart Card Reader was never intended for the average home computer user; it was sold to companies and organizations with tight security requirements, which just so happened to be the same places that would likely already be using BlackBerry mobile devices.

Of course, times and technology change. These devices once cost $200 apiece and were purchased in vast quantities for distribution to trusted personnel, but are now all but worthless. Even in new and unopened condition, they can be had for as little as $10 USD on eBay. For that price, it’s certainly worth taking a peek inside. Perhaps the hacker community can even find new applications for these once cutting-edge devices.

Continue reading “Teardown: BlackBerry Smart Card Reader”

Mergers And Acquisitions: Analog Devices Snaps Up Maxim Integrated For $21 B

Analog Devices will acquire Maxim Integrated for $20.9 billion in stock, as reported by Bloomberg this morning.

Perhaps the confusing part of the news is that the Bloomberg article mentions the acquisition will let Analog Devices better compete with Texas Instruments. Wait, didn’t Texas Instruments acquire Maxim back in 2015? Actually, no. There were rumors (reported then by Bloomberg) that TI was nearing an acquisition deal but it fell through in January of 2016.

You may remember that Analog Devices snapped up Linear Tech in a $30 B acquisition back in 2017. Considering this morning’s news, how will they compare to the might of TI? Looks like 2019 revenue for TI was $14.38 B while Analog reported $5.99 B. Add in Maxim’s revenue of $3.1 B and the combined $9.09 B still leaves a David and Goliath scenario. Revenue doesn’t tell the whole story, though, and the proverbial slingshot for Analog may be its existing portfolio of high-margin devices, grown even larger with this acquisition.

Considering how the last half decade played out, this might mark the beginning of another wild cycle of mergers and acquisitions. The consolidation trend continues as we approach a world where just a few gigantic semiconductor companies turn production lines up to eleven to fill the world’s insatiable appetite for more powerful electronics (and more electronics in general).

Pluto Might Not Be A Planet, But It Is An SDR Transceiver

Many of the SDR projects we see use a cheap USB dongle. They are great, but sometimes you want more and — especially — sometimes you want to transmit. The Analog Devices ADALM-Pluto SDR is easily available for $200, sometimes as low as $100, and it both transmits and receives using an Analog AD9363 and a Zynq FPGA. Although you normally use the device to pipe IQ signals to a host computer, you can run SDR applications on the device itself. That requires you to dig into the Zynq tools, which is fun but a topic for another time. In this post, I’m going to show you how you can use GNU Radio to make a simple Morse code beacon in the 2 m ham band.

I’ve had one on my bench for quite a while and I’ve played with it a bit. There are several ways to use it with GNU Radio and it seems to work very well. You do have to hack it to extend the frequency range down to the 2 m band. Sure, it might not be “to spec” once you broaden the frequency range, but it seems to work fine. Instead of working from 325 MHz to 3,800 MHz with a 20 MHz bandwidth, the hacked device transceives 70 MHz to 6,000 MHz with 56 MHz bandwidth. It is a simple hack you only have to do once: you tell the device that it has a slightly better chip onboard, and our guess is that the chips are the same but sorted by performance. So while the specs might be a little off, you probably won’t notice.
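The full GNU Radio build continues below the break, but as a taste, here’s a rough equivalent of the beacon using the pyadi-iio Python library instead. Everything here (the URI, tone offset, keying speed, gain) is an assumption to adjust for your setup; note that 146 MHz is below the stock 325 MHz floor, so the frequency hack must already be applied, and transmitting on the 2 m band requires an amateur radio license:

```python
import numpy as np
import adi  # pyadi-iio: pip install pyadi-iio

URI  = "ip:192.168.2.1"   # the Pluto's default USB network address
FREQ = int(146.52e6)      # 2 m calling frequency -- licensed hams only!
FS   = 521_000            # close to the Pluto's minimum sample rate
WPM  = 20                 # keying speed
TONE = 700.0              # CW offset tone in Hz, keeps us off the LO leakage

MORSE = {"A": ".-", "C": "-.-.", "D": "-..", "E": ".", "L": ".-..",
         "N": "-.", "Q": "--.-", "0": "-----"}  # extend as needed

def keying(message, fs, wpm):
    """Build the on/off keying envelope; one dit lasts 1.2/wpm seconds."""
    dit = int(1.2 / wpm * fs)
    env = []
    for ch in message.upper():
        if ch == " ":
            env += [0.0] * (7 * dit)                         # word gap
            continue
        for sym in MORSE[ch]:
            env += [1.0] * (dit if sym == "." else 3 * dit)  # dit or dah
            env += [0.0] * dit                               # element gap
        env += [0.0] * (2 * dit)                             # character gap
    return np.array(env, dtype=np.float32)

env = keying("CQ DE N0CALL", FS, WPM)       # substitute your own callsign
t = np.arange(len(env)) / FS
iq = env * np.exp(2j * np.pi * TONE * t)    # keyed tone 700 Hz above the LO
iq = (iq * 2**14).astype(np.complex64)      # Pluto expects ~2^14 full scale

sdr = adi.Pluto(URI)
sdr.tx_lo = FREQ
sdr.sample_rate = FS
sdr.tx_rf_bandwidth = FS
sdr.tx_hardwaregain_chan0 = -30             # start with lots of attenuation
sdr.tx_cyclic_buffer = True                 # hardware repeats the buffer
sdr.tx(iq)
input("Beaconing, press enter to stop...")
sdr.tx_destroy_buffer()
```

With the cyclic buffer flag set, the Pluto repeats the message in hardware; for long messages it may be friendlier to the device’s memory to leave tx_cyclic_buffer off and stream the samples in chunks.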

Continue reading “Pluto Might Not Be A Planet, But It Is An SDR Transceiver”

A Classy SDR Chip, Decapped

If you are a regular searcher for exotic parts among the virtual pages of semiconductor suppliers’ catalogs, you will probably have noticed that for a given function it is most often the part bearing the Analog Devices logo that is the most interesting. It may have more functionality, perhaps it will be of a higher specification, and it will certainly have a much higher price. [Zeptobars] has decapped and analyzed an AD chip that holds all three of those honors, the AD9361 SDR transceiver.

It’s placed under a slightly inflammatory title, “when microchips are more profitable than drugs“, but it does a good job of answering why a semiconductor device at the very cutting edge of what is possible at the time of release can be so expensive. The AD9361 is an all-in-one SDR transceiver with an astonishing bandwidth, and as such was a particularly special device when it reached the market in 2013. We see some particularly fine examples of on-chip inductors and PLL circuitry that must have consumed a significant design effort to preserve both bandwidth and noise characteristics. This is an item of physical beauty at a microscopic scale as well as one of technical achievement.

The financial analysis puts Analog Devices’ gross profit at about $103 of the $275 retail purchase price of an AD9361. The biggest slice at $105 goes to the distributor, and surprisingly the R&D and manufacturing costs are not as large as you might expect. How accurate these figures are is anybody’s guess, but they are derived from an R&D figure in the published financial report, so there is some credence to be given to them.

We’ve featured [Zeptobars]’ work more than once before. Fake Nordic Semi parts and a Soviet i8080 clone, for example, have both received the treatment. Always a source to watch out for!

Mergers And Acquisitions: Analog And Linear

Analog Devices and Linear Technology have announced today that they will combine forces to create a semiconductor company worth $30 billion.

This news follows the very recent acquisition of ARM Holdings by Japan’s SoftBank, and the earlier mergers, purchases, or acquisitions of On and Fairchild, Avago and Broadcom, NXP and Freescale, Microchip and Atmel, Intel and Altera, and a few more we’re forgetting at the moment.

Both Analog and Linear address similar markets; Analog Devices is best known for amps, interface, and power management ICs. Linear, likewise, isn’t known for ‘fun’ devices, but without their products the ‘fun’ components wouldn’t work. Because the product lines are so complementary, the resulting company stands to save $150 Million annually after the deal closes.

Analog and Linear are only the latest in a long line of semiconductor mergers and acquisitions, and theirs will certainly not be the last. The entire industry is consolidating, and the only way to grow is by teaming up with other companies. This raises the question of whether there will eventually be only one gigantic semiconductor company in the future. You’ll get different answers to that question from different people. Hughes, Fairchild, Convair, Douglas, McDonnell Douglas, North American, Grumman, Northrop, Northrop Grumman, Bell, Cessna, Schweizer and Sikorsky would say yes. Lockheed Martin and Boeing would say no. It’s the same thing.

FERMIAC: The Computer That Advanced Beyond The Manhattan Project

One of the keys to nuclear fission is sustaining a chain reaction. A slow chain reaction can provide clean power for a city, and a fast one can be used to create a weapon that will obliterate a city. These days, kids can learn about uranium and plutonium in high school. But just a few generations ago, the idea of splitting the atom was a lofty goal for the brightest physicists and mathematicians, who gathered at Los Alamos National Laboratory under the Manhattan Project.

Decoding the mysteries of nuclear fission required a great deal of experimentation and calculations. One bright physicist in particular made great strides on both fronts. That man was [Enrico Fermi], one of the fathers of the atomic bomb. Perhaps his greatest contribution to moving the research beyond the Manhattan Project was creating a handheld analog computer to do the math for him. This computational marvel is known as the FERMIAC.

What is Fission?

Nuclear fission occurs when a nucleus is split into fragments, a process that unleashes a great deal of energy. As a handful of neutrons travel through a reactor pile or other fissionable material, several outcomes are possible. Any one neutron collision might result in fission, which means there will be some number of new neutrons whose paths must be tracked. If fission does not occur, the neutrons may simply scatter upon collision, which changes their speed and trajectory. Some of the neutrons might be absorbed by the material, and others will simply escape it. All of these possibilities depend on the makeup of the material being bombarded and the speed of the neutron.

Fission Diagram by Michalsmid

Every event that happens to a neutron makes up part of its genealogical history. If this history is recorded and analyzed, a statistical picture starts to emerge that provides an accurate depiction of the fissility of a given material. [Fermi]’s computer facilitated the creation of such a picture by performing the mathematical grunt work of testing different materials, identifying which were most likely to sustain a reaction.

Before he left Italy and the looming threat of fascism, [Fermi] led a group of young scientists in Rome called the Via Panisperna boys. This group, which included future Los Alamos physicist [Emilio Segrè], ran many experiments in neutron transport. Their research proved that slow neutrons are much better candidates for fission than fast neutrons.

During these experiments, [Fermi] ran through the periodic table, determined to artificially irradiate every element until he got lucky. He never published anything regarding his methods for calculating the outcomes of neutron collisions. But when he got to Los Alamos, [Fermi] found that [Stanislaw Ulam] had also concluded that the same type of repeated random sampling was the key to building an atomic weapon.

The Monte Carlo Method: Shall We Play a Game?

Monte Carlo method applied to approximating the value of π. by CaitlinJo

[Ulam], a Polish-born mathematician who came to the US in 1935, arrived at his ideas about random sampling thanks to an illness. While recuperating from encephalitis he played game after game of solitaire. One day, he wondered about the probability of winning any one hand as laid out, and how best to calculate that probability. He believed that if he ran through enough games and kept track of the wins, the data would form a suitable and representative sample for modeling his chances of winning. Almost immediately, [Ulam] began to mentally apply this method to problems in physics, and proposed his ideas (PDF) to physicist and fellow mathematician [John von Neumann].
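The image above hints at the textbook demonstration of the idea: estimate π by throwing random points at a square and counting how many land inside the inscribed quarter circle. A minimal sketch in Python:

```python
import random

def estimate_pi(samples):
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square that land inside the quarter circle approaches pi/4."""
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4 * inside / samples

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10} samples: {estimate_pi(n):.5f}")
```

The estimate converges slowly (the error shrinks only as 1/√N), which is exactly why these calculations were so hungry for computing machinery.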

This top-secret method needed a code name. Another Los Alamos player, [Nick Metropolis], suggested ‘Monte Carlo’ in a nod to games of chance. He knew that [Ulam] had an uncle with a propensity for gambling who would often borrow money from relatives, saying that he just had to go to Monte Carlo. The game was on.

The Tricky Math of Fission

Determination of the elements most suitable for fission required a lot of calculations. Fission itself had already been achieved before the start of the Manhattan Project. But the goal at Los Alamos was a controlled, high-energy type of fission suitable for weaponization. The math of fission is complicated largely because of the sheer number of neutrons that must be tracked in order to determine the likelihood and speed of a chain reaction. There are so many variables involved that the task is monumental for a human mathematician.

[Stanislaw Ulam] and FERMIAC.

After [Ulam] and [von Neumann] had verified the legitimacy of the Monte Carlo method with regard to the creation of nuclear weaponry, they decided that these types of calculations would be a great job for ENIAC — a very early general purpose computer. This was a more intensive task than the one it was made to do: compute artillery firing tables all day and night. One problem was that the huge, lumbering machine was scheduled to be moved from Philadelphia to the Ballistics Research Lab in Maryland, which meant a long period of downtime.

While the boys at Los Alamos waited for ENIAC to be operational again, [Enrico Fermi] developed the idea of forgoing ENIAC altogether and creating a small device that could run Monte Carlo simulations instead. He enlisted his colleague [Percy King] to build the machine. Their creation was built from joint Army-Navy cast-off components, and in a nod to that great computer, he dubbed it FERMIAC.

FERMIAC: Hacking Probabilities

FERMIAC was created to take over the tedious calculations required by the study of neutron transport, something of an end-run around brute force. It’s made mostly of brass and resembles a trolley car. In order to use it, several adjustable drums are set using pseudorandom numbers. One of these numbers represents the material being traversed, and a random choice is made between fast and slow neutrons. A second digit is chosen to represent the direction of neutron travel, and a third number indicates the distance traveled to the next collision.

FERMIAC in action.

Once these settings are dialed in, the device is physically driven across a 2-D scale drawing of the nuclear reactor or materials being tested. As it goes along, it plots the paths of neutrons through various materials by marking a line on the drawing. Whenever a material boundary is crossed, the appropriate drum is adjusted to represent a new pseudorandom digit.
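In software, the whole procedure collapses into a few lines. Here’s a toy one-dimensional version of the random walk FERMIAC mechanized: sample a free-flight distance, then an outcome, and repeat. Every number below is invented; nothing here resembles real cross sections:

```python
import random
from collections import Counter

SLAB = 10.0                       # slab thickness, in mean free paths
P_ABSORB, P_FISSION = 0.2, 0.1    # per-collision outcome odds (invented)

def walk():
    """Follow one neutron through the slab, one random draw per 'drum'."""
    x, direction = 0.0, 1.0
    while True:
        x += direction * random.expovariate(1.0)  # flight to next collision
        if x < 0.0 or x > SLAB:
            return "escaped"
        r = random.random()
        if r < P_ABSORB:
            return "absorbed"
        if r < P_ABSORB + P_FISSION:
            return "fission"
        direction = random.choice((-1.0, 1.0))    # scatter: new heading

print(Counter(walk() for _ in range(100_000)))
```

Each call to walk() is one neutron history, the equivalent of a single pass of the trolley across the drawing.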

FERMIAC was only used for about two years before it was completely supplanted by ENIAC. But it was an excellent stopgap that allowed the work at Los Alamos not only to continue unabated, but to progress rapidly. FERMIAC is currently on display at the Bradbury Science Museum in Los Alamos, New Mexico, alongside replicas of Fat Man and Little Boy. [Fermi]’s legacy is cemented as one of the fathers of the atomic bomb. But creating FERMIAC cements his legacy as a hacker, too.

After Los Alamos, [Stanislaw Ulam] would continue to make history in the field of nuclear physics. [Enrico Fermi] was opposed to participating in the creation of the exponentially more powerful hydrogen bomb, but [Ulam] accepted the challenge. He proved that fellow Los Alamos physicist [Edward Teller]’s original design was infeasible. The two men worked together and by 1951 had produced the Teller-Ulam design, which became the basis for modern thermonuclear weaponry.

Today, the Monte Carlo method is used across many fields to describe systems through randomness and statistics. Many applications for this type of statistical modeling present themselves in fields where probabilities are concerned, like finance, risk assessment, and modeling the universe. Wherever the calculation of all possibilities isn’t feasible, the Monte Carlo method can usually be found.

[Main Image Source: FERMIAC machine by Mark Pellegrini]

UPDATE: Commenter [lwatchdr] pointed out that use of the FERMIAC began after the Manhattan Project had officially ended in 1946. Although many of the same people were involved, this analog computer wasn’t put into use until about a year later.