Bombing The Sky For The Sake Of Radio

If you are familiar with radio propagation you’ll know that radio waves do not naturally bend around the earth. Like light, and indeed all electromagnetic radiation, in free space they travel in a straight line.

At very high frequencies this means that in normal circumstances once a receiver moves over the horizon from a transmitter that’s it, you’re out of range and there can be no communication. But at lower frequencies this is not the case. As you move through the lower end of the VHF into the HF (Short Wave) portion of the spectrum and below, the radio signal routinely travels far further than the horizon, and at the lower HF frequencies it starts to reach other continents, even as far as the other side of the world.
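As a rough illustration of just how limited that line-of-sight range is, a standard rule of thumb (not from the original article) puts the radio horizon for an antenna at height h metres at about 3.57√h kilometres, once typical atmospheric refraction is accounted for:

```python
import math

def radio_horizon_km(height_m: float) -> float:
    """Approximate radio horizon in km for an antenna height_m metres up,
    using the common 4/3-Earth-radius refraction rule of thumb:
    d ~= 3.57 * sqrt(h)."""
    return 3.57 * math.sqrt(height_m)

def max_link_km(tx_height_m: float, rx_height_m: float) -> float:
    """Longest line-of-sight path: the sum of the two stations' horizons."""
    return radio_horizon_km(tx_height_m) + radio_horizon_km(rx_height_m)

# A 10 m home antenna working a 2 m mobile whip: only a few tens of km.
print(round(max_link_km(10.0, 2.0), 1))
```

Without help from the atmosphere, even a lofty antenna buys only tens of kilometres, which is why the long-distance behaviour at HF needs another explanation.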

Of course, we haven’t changed the Laws Of Physics. Mr. Scott’s famous maxim still stands. Radio waves at these frequencies are being reflected from ionised portions of the atmosphere and from the ground, sometimes in multiple “hops”. The science of this mechanism has been the subject of over a hundred years of exploration and will no doubt be for hundreds more, for the atmosphere is an unreliable boiling soup of gases rather than a predictable mirror for your radio waves.
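The textbook version of that reflection mechanism (standard ionospheric physics, not from the original article) is that a layer with electron density N per cubic metre reflects vertically-incident signals below a critical frequency f_c ≈ 9√N Hz, and the secant law then raises the maximum usable frequency for signals arriving at a shallow angle:

```python
import math

def critical_frequency_hz(electron_density_per_m3: float) -> float:
    """Plasma critical frequency of an ionospheric layer:
    f_c ~= 9 * sqrt(N), N in electrons per cubic metre.
    Vertically-incident signals below f_c are reflected."""
    return 9.0 * math.sqrt(electron_density_per_m3)

def muf_hz(f_critical_hz: float, incidence_deg: float) -> float:
    """Secant law (flat-layer approximation): maximum usable frequency
    for a signal meeting the layer incidence_deg from the vertical."""
    return f_critical_hz / math.cos(math.radians(incidence_deg))

# A daytime F2 layer with N ~ 1e12 e/m^3 gives f_c ~ 9 MHz; at a
# shallow 70-degree incidence the MUF climbs into the upper HF range.
fc = critical_frequency_hz(1e12)
print(round(fc / 1e6, 1), round(muf_hz(fc, 70.0) / 1e6, 1))
```

Because N varies with sunlight, season, and solar activity, those figures drift by the hour, which is exactly the unreliability the experiments below set out to tame.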

Radio amateurs have turned pushing the atmosphere to its limits into a fine art, but what if you would prefer to be able to rely on it? The US military has an interest in reliable HF communications as well as in evening out the effects of solar wind on the ionisation of the atmosphere, and has announced a research program involving bombing the upper atmosphere with plasma launched from cubesats. Metal ions will be created both by chemical reactions and by small explosions, and their effects on the atmosphere will be studied.

Of course, this isn’t the first time the upper atmosphere has been ionised in military experiments. Both the USA and the USSR exploded nuclear weapons at these altitudes before the cessation of atmospheric nuclear testing, and more recently both have directed high-power radio waves with the aim of ionising the upper atmosphere. You may have heard of the USA’s HAARP project in Alaska, but Russia’s Sura Ionospheric Heating Facility near Nizhniy Novgorod has been used for similar work. It remains to be seen whether these latest experiments will meet with success, but we’re sure they won’t be the last of their kind.

We’ve looked at radio propagation in the past with this handy primer, and we’ve also featured a military use of atmospheric reflection with over-the-horizon radar.

Fishbowl Starfish Prime upper atmosphere nuclear test image via Los Alamos National Laboratory. As an image created by an officer or employee of the United States government as part of their official duties this image is in the public domain.

Atari Archaeology Without Digging Up Landfill Sites

We are fortunate to live in an age of commoditized high-power computer hardware and driver abstraction, in which most up-to-date computers have the ability to do more or less anything that requires keeping up with the attention of a human without breaking a sweat. Processors are very fast, memory is plentiful, and 3D graphics acceleration is both speedy and ubiquitous.

Thirty years ago it was a different matter on the desktop. Even the fastest processors of the day would struggle to perform on their own all the tasks demanded of them by a 1980s teenager who had gained a taste for arcade games. The manufacturers rose to this challenge by surrounding whichever CPU they had chosen with custom co-processors, ASICs that would take away the heavy lifting associated with 2D graphics acceleration, or audio and music synthesis.

One of the 1980s objects of computing desire was the Atari ST, featuring a Motorola 68000 processor, a then-astounding 512k of RAM, a GUI OS, high-res colour graphics, and 3.5″ floppy drive storage. Had you opened up the case of your ST, you’d have found those ASICs we mentioned as being responsible for its impressive spec.

Jumping forward three decades, [Christian Zietz] found that there was frustratingly little information on the ST ASIC internal workings. Since a trove of backed-up data became available when Atari closed down he thought it would be worth digging through it to see what he could find. His write-up is a story of detective work in ancient OS and backup software archaeology, but it paid off as he found schematics for not only an ASIC from an unreleased Atari product but for the early ST ASICs he was looking for. He found hundreds of pages of schematics and timing diagrams which will surely take the efforts of many Atari enthusiasts to fully understand, and best of all he thinks there are more to be unlocked.

We’ve covered a lot of Atari stories over the years, but many of them have related to their other products such as the iconic 2600 console. We have brought you news of an open-source ST on an FPGA though, and more recently the restoration of an ST that had had a hard life. The title of this piece refers to the fate of Atari’s huge unsold stocks of 2600 console cartridges, such a disastrous marketing failure that unsold cartridges were taken to a New Mexico landfill site in 1983 and buried. We reported on the 2013 exhumation of these video gaming relics.

A tip of the hat to Hacker News for bringing this to our attention.

Atari ST image, Bill Bertram (CC-BY-2.5) via Wikimedia Commons.

From Project To Kit: The Final Furlong

This article is the fifth in a series looking at the process of bringing an electronic kit to market from a personal project. We’ve looked at market research, we’ve discussed making a product from your project and writing the best instructions possible before stuffing your first kits ready for sale. In this article we’ll tackle the different means of putting your kits out there for sale.

Given a box of ready-to-sell kits, what next? You have to find some means of selling them, getting them in front of your customer, making the sale, sending them to the purchaser, and safely collecting their money. A few years ago this was an expensive and risky process involving adverts in print magazines and a lot of waiting, but we are fortunate. The Internet has delivered us all the tools we need to market and sell a product like an electronic kit, and in a way that needn’t cost a fortune. We’ll now run through a few of those options for selling your kits, before looking at shipping, marketing, and post-sales support in the final article in the series.

Continue reading “From Project To Kit: The Final Furlong”

Police Baffled? Send For The Radio Amateurs!

The police force in Evanston, Illinois had a problem on their hands. A mystery transmitter was blocking legal use of radio devices, car key fobs, cellphones, and other transmitters in an area of their city, and since it was also blocking 911 calls they decided to investigate it. Their first call for help went to the FCC who weren’t much use, telling them to talk to the manufacturers of the devices affected.

Eventually they approached the ARRL, the USA’s national amateur radio organisation, who sent along [Kermit Carlson, W9XA] to investigate. He fairly quickly identified the frequencies with the strongest interference and the likely spot from which it originated, and after some investigation it was traced to a recently replaced neon sign power supply. Surprisingly, the supply was not replaced with a fault-free unit; its owner merely agreed to turn it off should any further interference be reported.

The ARRL are highlighting this otherwise fairly unremarkable case to draw attention to the problem of devices appearing on the market with little or no pretence of electromagnetic compatibility compliance. In particular they are critical of the FCC’s lacklustre enforcement response in cases like this one. It’s a significant problem worldwide as huge numbers of very cheap switch-mode mains power supplies have replaced transformers in mains power applications, and in any center of population its effects can be readily seen with an HF radio in the form of a significantly raised RF noise floor. Though we have reported before on the FCC’s investigation of the noise floor problem we’d be inclined to agree with the ARRL that it is effective enforcement of EMC regulations that is key to the solution.

City of Evanston police vehicle picture, [Inventorchris] (CC BY-NC 2.0) via Flickr.

Fail Of The Week: Machining Bismuth

[David Cook]’s summary below the write-up of his experiences working with a bismuth ingot is succinct.

I wasted a weekend learning why elemental bismuth is not commonly used for metal parts.

It’s a fair assessment of his time spent growing unspectacular bismuth crystals, casting a bismuth cylinder that cracked, and machining bismuth only to be left with a very rough finish. But even though he admits the exercise was unsuccessful, he does provide us with a fascinating look at the physical properties of the element.

This is what [David] wanted to make. Alchemist-hp + Richard Bartz with focus stack. (Own work) [CC BY-SA 3.0], via Wikimedia Commons

Bismuth is one of those elements you pass by in your school chemistry lessons; it has applications in machining alloys and as a lead replacement, but most of us have never knowingly encountered it in the real world. It’s one of the heavy metals, below antimony and to the right of lead on the Periodic Table. Curious schoolchildren may have heard that like water it expands on solidifying, or that it is diamagnetic, and most of us have probably seen spectacular pictures of its crystals coated in colourful iridescent oxides.

It was a Hackaday story about these crystals that attracted [David] to the metal. It has a low enough melting point – 271.5 °C – that it can be liquefied on a domestic stove, so mindful of his marital harmony should he destroy any kitchen appliances, he bought a cheap electric ring from Amazon to go with his bismuth ingot, and set to work.

His first discovery was that cheap electric rings outdoors aren’t very effective metallurgy furnaces. Relocating to the kitchen and risking spousal wrath, he did eventually melt his bismuth and pick off the top layer once it had resolidified, to reveal some crystals.

These are the bismuth crystals he made.

Unfortunately for him, instead of spectacular colors and huge crystals, the sight that greeted him was one of little brilliance. Small grey crystals with no iridescence. It seems the beautiful samples are made by a very slow cooling of the liquid bismuth, followed by a quick pouring off of the remaining molten metal. Future efforts, he assures us, will involve sand-insulated molds and careful temperature monitoring.

Undeterred, he continued with his stock of bismuth and embarked on the creation of a cylinder. Early efforts with a clay mold resulted in cracked cylinders, so in desperation he cast the entirety of the metal in an aluminium baking tray and cut the resulting ingot to a rough piece of stock for turning.

Poor finish on machined bismuth.

With the bismuth in the lathe, he then came face to face with what he alluded to in his conclusion above, why machined bismuth parts aren’t something you’ll encounter. His cylinder came out with significantly rough patches on the surface, because bismuth is both crystalline and brittle. He suggests improvements could be made if the metal could be solidified with fewer crystals, but it’s obvious that elemental bismuth on its own is not a winner in the turning stakes.

We suggest you take a look at [David]’s write-up. It may be presented as a Fail of The Week here, but in fact it’s more of a succession of experiments that didn’t work than an unmitigated disaster. The result is an interesting and well-documented read that we’re sure most Hackaday readers will gain something from.

Aside from the bismuth crystals linked to above, we’ve featured bismuth a few times here at Hackaday. A low-temperature soldering process used it in an alloy, and we’ve even featured someone using it in another alloy to print using a RepRap.

Thanks [nebk] for the tip.

Colossus: Face To Face With The First Electronic Computer

When the story of an invention is repeated as Received Opinion for the younger generation it is so often presented as a single one-off event, with a named inventor. Before the event there was no invention, then as if by magic it was there. That apple falling on Isaac Newton’s head, or Archimedes overflowing his bath, you’ve heard the stories. The inventor’s name will sometimes differ depending on which country you are in when you hear the story, which provides an insight into the flaws in the simple invention tales. The truth is that in so many cases an invention does not have a single Eureka moment; instead the named inventor builds on the work of so many others who have gone before, and is the lucky engineer or scientist whose ideas result in the magic breakthrough before anyone else’s.

The history of computing is no exception, with many steps along the path that has given us the devices we rely on for so much today. Blaise Pascal’s 17th century French mechanical calculator, Charles Babbage and Ada, Countess Lovelace’s work in 19th century Britain, Herman Hollerith’s American tabulators at the end of that century, or Konrad Zuse’s work in prewar Germany represent just a few of them.

So if we are to search for an inventor in this field we have to be a little more specific than “Who invented the first computer?”, because there are so many candidates. If we restrict the question to “Who invented the first programmable electronic digital computer?” we have a much simpler answer, because we have ample evidence of the machine in question. The Received Opinion answer is therefore “The first programmable electronic digital computer was Colossus, invented at Bletchley Park in World War Two by Alan Turing to break the Nazi Enigma codes, and it was kept secret until the 1970s”.

It’s such a temptingly perfect soundbite laden with pluck and derring-do that could so easily be taken from a 1950s Eagle comic, isn’t it? Unfortunately it contains such significant untruths as to be rendered useless. Colossus is the computer you are looking for, it was developed in World War Two and kept secret for many years afterwards, but the rest of the Received Opinion answer is false. It wasn’t invented at Bletchley, its job was not the Enigma work, and most surprisingly Alan Turing’s direct involvement was only peripheral. The real story is much more interesting.

Continue reading “Colossus: Face To Face With The First Electronic Computer”

A PDP-11 On A Chip

If you entered the world of professional computing sometime in the 1960s or 1970s there is a high probability that you would have found yourself working on a minicomputer. These were a class of computer smaller than the colossal mainframes of the day, with a price tag that put them within the range of medium-sized companies and institutions rather than large corporations or government-funded entities. Physically they were not small machines, but compared to the mainframes they did not require a special building to house them, or a high-power electrical supply.

A PDP-11 at The National Museum Of Computing, Bletchley, UK.

One of the most prominent among the suppliers of minicomputers was Digital Equipment Corporation, otherwise known as DEC. Their PDP line of machines dominated the market, and can be found in the ancestry of many of the things we take for granted today. The first UNIX development in 1969 for instance was performed on a DEC PDP-7.

DEC’s flagship product line of the 1970s was the 16-bit PDP-11 series, launched in 1970 and continuing in production until sometime in the late 1990s. Huge numbers of these machines were sold, and it is likely that nearly all adults reading this have at some time or other encountered one at work even if we are unaware that the supermarket till receipt, invoice, or doctor’s appointment slip in our hand was processed on it.

During that more-than-20-year lifespan, of course, DEC did not retain the 74-series-logic-based architecture of the earliest model. Successive PDP-11 generations featured ever greater integration of their processor, culminating by the 1980s in the J-11, a CMOS microprocessor implementation of a PDP-11/70. This took the form of two integrated circuits mounted on a large 60-pin DIP ceramic wafer. It was one of these devices that came the way of [bhilpert], and instead of retaining it as a curio he decided to see if he could make it work.

The PDP-11 processors had a useful feature: a debugging console built into their hardware. This means that it should be a relatively simple task to bring up a PDP-11 processor like the J-11 without providing the rest of the PDP-11 to support it, and it was this task that he set about performing. Providing a 6402 UART at the address expected of the console with a bit of 74 glue logic, a bit more 74 for an address latch, and a couple of 6264 8K by 8 RAM chips gave him a very simple but functional PDP-11 on a breadboard. He found it would run with a clock speed as high as 11MHz, but baulked at a 14MHz crystal. He suggests that the breadboard layout may be responsible for this. Hand-keying a couple of test programs, he was able to demonstrate it working.

We’ve seen a lot of the PDP-11 on these pages over the years. Of note are a restoration of a PDP-11/04, this faithful reproduction of a PDP-11 panel emulated with the help of a Raspberry Pi, and an entire PDP-11 emulated on an AVR microcontroller. We have indeed come a long way.

Thanks [BigEd] for the tip.