The National Institute of Standards and Technology was founded on March 3, 1901 as the National Bureau of Standards, taking on its current moniker in 1988. The organisation is charged by the government with ensuring the uniformity of weights and measures across the United States, and more generally with helping industry, academia, and other users wherever some kind of overarching standard is required.
One of the primary jobs of NIST is the production and sale of Standard Reference Materials, or SRMs. These cover a huge variety of applications, from steel samples to concrete and geological materials like clay. There are edible SRMs, too. Yes, you can purchase yourself a jar of NIST Standard Peanut Butter, though you might find the price uncompetitive with the varieties at your local supermarket. Let’s dive into why these “standard” foods exist, and see what’s available from the shelves of our favourite national standards institute.
It’s true what they say — you never know what you can do until you try. Russell Kirsch, who developed the first digital image scanner and subsequently invented the pixel, was a firm believer in this axiom. And if Russell had never tried to get a picture of his three-month-old son into a computer back in 1957, you might be reading Hackaday in print right now. Russell’s work laid the foundation for the algorithms and storage methods that make digital imaging what it is today.
Russell A. Kirsch was born June 20, 1929 in New York City, the son of Russian and Hungarian immigrants. He got quite an education, beginning at the Bronx High School of Science. He then earned a bachelor’s degree in Electrical Engineering at NYU and a Master of Science from Harvard, and went on to attend American University and MIT.
In 1951, Russell went to work for the National Bureau of Standards, now known as the National Institute of Standards and Technology (NIST). He spent nearly 50 years at NIST, starting out on one of the first programmable computers in America, SEAC (Standards Eastern Automatic Computer). This room-sized computer, built in 1950, was developed as an interim solution for the Census Bureau to do research (PDF).
Like the other computers of its time, SEAC spoke the language of punch cards, mercury memory, and wire storage. Russell Kirsch and his team were tasked with finding a way to feed pictorial data into the machine without any prior processing. Since the computer was supposed to be temporary, its use wasn’t as tightly controlled as that of other machines. Although it ran 24/7 and got plenty of use, SEAC was accessible enough to leave room for bleeding-edge experimentation. NIST ended up keeping SEAC around for the next thirteen years, until 1963.
The Original Pixel Pusher
The term ‘pixel’ is a portmanteau of picture element. Technically speaking, a pixel is the smallest addressable element of a digital image. Pixels are the building blocks for anything that can be displayed on a computer screen, so they’re kind of the first addressable blinkenlights.
Kirsch’s scanner fixed the photograph to a rotating drum. As the drum slowly rotated, a photomultiplier moved back and forth, scanning the image through a square viewing hole in the wall of a box. The tube digitized the picture by transmitting ones and zeros to SEAC describing what it saw through the hole — 1 for white, and 0 for black. The resulting digital image of his son Walden is 76 x 76 pixels, the maximum allowed by SEAC.
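That one-bit digitization step is simple enough to sketch in a few lines of Python. The threshold value and the flat list of grayscale samples here are illustrative assumptions, not details of SEAC’s actual hardware:

```python
# A minimal sketch of one-bit image digitization: each scanned sample
# becomes 1 for white or 0 for black, building up a 76 x 76 bitmap.

WIDTH = HEIGHT = 76  # the maximum image size SEAC could handle
THRESHOLD = 128      # assumed cutoff on a 0-255 grayscale sample

def digitize(samples):
    """Turn a row-major list of grayscale samples into a 1-bit image."""
    assert len(samples) == WIDTH * HEIGHT
    return [
        [1 if samples[y * WIDTH + x] >= THRESHOLD else 0
         for x in range(WIDTH)]
        for y in range(HEIGHT)
    ]

# A uniformly bright "photograph" digitizes to an all-white bitmap:
bitmap = digitize([200] * (WIDTH * HEIGHT))
```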
In the video below, Russell discusses the idea and shows that variable pixels make a better image, carrying more information than square pixels do with significantly fewer pixels overall. It takes some finagling, as pixel pairs of triangles and rectangles must be carefully chosen, rotated, and mixed together to best represent the image, but the image quality is definitely worth the effort. Following that is a video of Russell discussing SEAC’s hardware.
Russell retired from NIST in 2001 and moved to Portland, Oregon. As of 2012, he could be found in the occasional coffeehouse, discussing technology with anyone he could engage. Unfortunately, Russell developed Alzheimer’s and died from complications on August 11, 2020. He was 91 years old.
Buried on page 25 of the 2019 budget proposal for the National Institute of Standards and Technology (NIST), under the heading “Fundamental Measurement, Quantum Science, and Measurement Dissemination”, there’s a short entry that has caused plenty of debate and even a fair deal of anger among those in the amateur radio scene:
NIST will discontinue the dissemination of the U.S. time and frequency via the NIST radio stations in Hawaii and Ft. Collins, CO. These radio stations transmit signals that are used to synchronize consumer electronic products like wall clocks, clock radios, and wristwatches, and may be used in other applications like appliances, cameras, and irrigation controllers.
The NIST stations in Hawaii and Colorado are the home of WWV, WWVH, and WWVB. The oldest of these stations, WWV, has been broadcasting in some form or another since 1920, making it the longest continually operating radio station in the United States. Yet in order to save approximately $6.3 million, these time and frequency standard stations are potentially on the chopping block.
What does that mean for those who don’t live and breathe radio? The loss of WWV and WWVH is probably a non-event for anyone outside of the amateur radio world. In fact, most people probably don’t know they even exist. Today they’re primarily used as frequency standards for calibration purposes, but in recent years have been largely supplanted by low-cost oscillators.
WWVB, on the other hand, is used by millions of Americans every day. By NIST’s own estimates, over 50 million timepieces of some form or another automatically synchronize their time using the digital signal that’s been broadcast since 1963. Therein lies the debate: many simply don’t believe that NIST is going to shut down a service that’s still actively being used by so many average Americans.
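The WWVB time code itself is charmingly low-tech: the station drops its carrier power once per second, and the length of the drop encodes the symbol — nominally 0.2 s for a 0 bit, 0.5 s for a 1 bit, and 0.8 s for a frame marker. A receiver’s decode step can be sketched like this, with the classification thresholds as illustrative assumptions:

```python
# Sketch of classifying WWVB symbols from measured carrier-power drops.
# Nominal drop lengths: ~0.2 s = bit 0, ~0.5 s = bit 1, ~0.8 s = marker.

def classify(drop_seconds):
    """Map a measured power-drop duration (seconds) to a WWVB symbol."""
    if drop_seconds < 0.35:
        return "0"
    elif drop_seconds < 0.65:
        return "1"
    return "M"  # frame/position marker

# A made-up run of measured drop durations, one per second:
measured = [0.8, 0.2, 0.5, 0.2, 0.2, 0.5]
symbols = [classify(d) for d in measured]
# symbols == ['M', '0', '1', '0', '0', '1']
```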
The problem lies with the ambiguity of the statement. That the older and largely obsolete stations will be shuttered is really no surprise, but because the NIST budget doesn’t specifically state whether or not the more modern WWVB is also included, there’s room for interpretation. Especially since WWVB and WWV are both broadcast from Ft. Collins, Colorado.
What say the good readers of Hackaday? Do you think NIST is going to take down the relatively popular WWVB? Are you still using devices that sync to WWVB, or have they all moved over to pulling their time down over the Internet? If WWVB does go off the air, are you prepared to set up your own pirate time station?
Have you ever stood under a dome and whispered, only to hear the echo of your voice come back much louder? Researchers at NIST used a similar principle to improve the atomic force microscope (AFM), allowing them to measure rapid changes in microscopic material more accurately than ever before.
An AFM works by using a minuscule sharp probe. The instrument detects deflections in the probe, often using a piezoelectric transducer or a laser sensor. By moving the probe across a surface and measuring the transducer’s output, the microscope can build a profile of the surface. The NIST team used a laser traveling through a circular waveguide tuned to a specific frequency. The waveguide sits extremely close (150 nm) to a very tiny probe weighing about a trillionth of a gram. When the probe moves even a tiny amount, the waveguide’s characteristics change to a much larger degree, and a photodetector monitoring the laser light passing through the resonator can pick this up.
Getting cryptography right isn’t easy, and it’s a lot worse on constrained devices like microcontrollers. RAM is usually the bottleneck — you will smash your stack computing a SHA-2 hash on an AVR — but other resources like computing power and flash code storage space are also at a premium. Trimming down a standard algorithm to work within these constraints opens up the Pandora’s box of implementation-specific flaws.
Still, there are some concrete recommendations. Here are some spoilers. For encryption, they recommend a trimmed-down version of AES-128, which is a well-tested block cipher on the big machines. For message authentication, they’re happy with AES-128 in Galois/Counter Mode (GCM).
I was most interested in hashing, and came away disappointed; the conclusion is that the SHA-2 and SHA-3 families simply require too much state (and RAM), so they make no recommendation, leaving you to pick among lesser-known functions: check out PHOTON or SPONGENT, though both are still being actively researched.
If you think small-device security is easy, read through the 22-question checklist that starts on page twelve. And if you’re looking for a good starting point to read up on the state of the art, the bibliography is extensive.
We all know that it’s not the volts that kill you, it’s the amps. But exactly how many electrons per second are there in an amp? It turns out that nobody really knows. But according to a press release from the US National Institute of Standards and Technology (NIST), that’s all going to change in 2018.
The amp is a “metrological embarrassment” because it’s not defined in terms of any physical constants. Worse, it’s not even potentially measurable, being the “constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed 1 meter apart in vacuum, would produce between these conductors a force equal to 2 × 10⁻⁷ newton per meter of length.” You can’t just order a spool of infinite length and negligible cross-section wire and have it express shipped.
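On paper, at least, the old definition is self-consistent: plugging the stated geometry into the textbook formula for the force per unit length between two parallel currents, F/L = μ₀I₁I₂/(2πd), gives back exactly 2 × 10⁻⁷ N/m. A quick numerical sanity check:

```python
import math

# Verify the pre-redefinition ampere on paper: two infinite parallel
# wires carrying 1 A, spaced 1 m apart in vacuum, should feel a force
# of 2e-7 newtons per metre of length.

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, N/A^2 (exact in the old SI)
I1 = I2 = 1.0              # current in each wire, amperes
d = 1.0                    # separation, metres

force_per_metre = mu0 * I1 * I2 / (2 * math.pi * d)
# force_per_metre == 2e-7 (up to floating-point rounding)
```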
So to quantify the exact number of electrons per second in an amp, the folks at NIST need an electron counter. This device turns out to be a super-cooled, quantum-mechanical gate that closes itself once an electron has passed through. Repeatedly re-opening one of these gates at gigahertz rates still provides only around a picoamp. Current (tee-hee) research is focused on making practical devices that push a bit more juice. Even then, it’s likely that they’ll need to gang 100 of these gates together to get even a single microamp. But when they do, they’ll know how many electrons per second have passed through to within a few tens of parts per billion. Not too shabby.
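The number NIST is chasing is easy to write down, even if counting it is not: one ampere is one coulomb per second, and dividing by the elementary charge gives the electron rate. (The charge value below is the one later fixed exactly by the 2019 SI redefinition; at the time of this research it was still a measured quantity.)

```python
# Back-of-envelope: how many electrons per second make up one ampere?

e = 1.602176634e-19   # elementary charge in coulombs

electrons_per_amp = 1.0 / e
# roughly 6.24e18 electrons passing per second
```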
We had no idea that the amp was indirectly defined, but now that we do, we’re looking forward to a better standard. Thanks, NIST!
Imagine a world where the most widely-used cryptographic methods turn out to be broken: quantum computers allow encrypted Internet data transactions to become readable by anyone who happened to be listening. No more HTTPS, no more PGP. It sounds a little bit sci-fi, but that’s exactly the scenario that cryptographers interested in post-quantum crypto are working to save us from. And although the (potential) threat of quantum computing to cryptography is already well-known, this summer has seen a flurry of activity in the field, so we felt it was time for a recap.
How Bad Is It?
If you take the development of serious quantum computing power as a given, all of the encryption methods based on factoring large numbers or computing discrete logarithms (most notably RSA, elliptic curve cryptography, and Diffie-Hellman) are in trouble. Specifically, Shor’s algorithm, when run on a quantum computer, renders the previously difficult math problems that underlie these methods trivially easy, almost irrespective of the chosen key length. That covers most currently used public-key crypto and the key exchange used in negotiating an SSL connection. That is (or will be) bad news, as those are what’s used for nearly every important encrypted transaction that touches your daily life.
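To see why factoring is the linchpin, here is a deliberately tiny textbook RSA round trip (toy numbers for illustration only; real keys use primes thousands of bits long). Anyone who can factor the public modulus n back into p and q can derive the private exponent, and that is precisely the step Shor’s algorithm makes efficient:

```python
# Textbook RSA with toy numbers. The security rests entirely on the
# difficulty of factoring n; Shor's algorithm removes that difficulty.

p, q = 61, 53          # the secret primes; factoring n reveals them
n = p * q              # 3233, the public modulus
e = 17                 # public exponent
d = 2753               # private exponent: (d * e) % lcm(p-1, q-1) == 1

message = 65
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n
# recovered == message
```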