When we take a new Wi-Fi router from its box, the stock antenna is a short plastic stub with a reverse-polarity SMA plug on one end. Newer, fancier routers have more than one such antenna for clever tricks to extend their range or bandwidth, but even when the manufacturer has encased it in mean-looking plastic, the antenna inside is the same. It’s a sleeve dipole: think of it as a vertical dipole antenna in which the lower radiator is hollow, with the feeder routed up through it.
These antennas do a reasonable job of covering a typical home, because a vertical sleeve dipole is omnidirectional. It radiates in all horizontal directions, or if you are a pessimist you might say it radiates equally badly in all horizontal directions. [Brian Beezley, K6STI] has an interesting modification which changes that: he’s made a simple Yagi beam antenna from copper wire and part of a plastic yoghurt container, and slotted it over the sleeve dipole to make it directional and improve its gain and throughput in that direction.
Though its construction may look rough and ready, it has been carefully simulated, so it’s as good a design as it can be in the circumstances. The simulation predicts 8.6 dB of gain, though as any radio amateur will tell you, antenna gain figures should always be taken with a pinch of salt. It does, however, provide a significant improvement in range, which for the investment involved is hard to complain about. Give it a try, and bring connectivity back to far-flung corners of your home!
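To put that figure in perspective, decibels are a logarithmic power ratio, so a quick conversion shows how much extra power ends up in the favoured direction. This is only a back-of-the-envelope sketch; the 8.6 dB number is the one value taken from the simulation above:

```python
# Convert the simulated antenna gain from dB to a linear power ratio.
# The relation is: ratio = 10 ** (dB / 10).
GAIN_DB = 8.6  # simulated gain figure quoted in the article

power_ratio = 10 ** (GAIN_DB / 10)
print(round(power_ratio, 2))  # roughly 7.24x the power of the bare dipole
```

In other words, if the simulation holds, a receiver sitting in the beam sees over seven times the power it would from the stock omnidirectional antenna.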
[Wisecracker] likes how the Amazon Echo Dot works, but he doesn’t like how it sounds or how it resembles a hockey puck. A little 3D printing, though, and he transformed the Dot into a credible Death Star. That doesn’t sound very friendly, we guess, so he calls it Alex-Star.
What makes it work is the Death Star’s “superlaser” — the weapon operated by a console that looks suspiciously like some studio video equipment — happens to be about the size and shape of a two-inch speaker. [Wisecracker] added a slot to let the sound out of the second speaker. You can see the thing in action in the video below.
But before [Andy Brown] could build this power supply, he had to reverse-engineer the modules. Based on what he learned, and armed with a data sheet for the modules, he designed a controller to take full advantage of their capabilities and ended up with a full-featured power supply. The modules are rated for 66 watts total dissipation at 3.3 volts and have a secondary 5-volt output. Using an ATmega328, [Andy] was able to control the module, provide a display for voltage and current, add temperature sensing and fan control, and even include a UART to allow data logging to a serial port. His design features mainly through-hole components to make the build accessible to everyone. A suitable case is yet to come, and we’re looking forward to seeing the finished product.
A milliohm meter is a very handy piece of test equipment. Most hand-held multimeters cannot measure low resistances, and the bench meters that can are usually quite expensive. [barbouri] has shared the details of his milliohm meter build on his blog, and it looks pretty nice.
When using a single pair of leads to measure very low resistances, the resistance of the measuring wires and the voltage drops across the various joints become substantial enough to invalidate your measurement. The solution is the “Kelvin method”, or 4-wire measurement. This involves passing a highly stable current, derived from a temperature-compensated constant-current source, through the unknown resistance, then using a second pair of leads to measure the voltage drop across it. Since almost no current flows in the sense leads, their resistance doesn’t skew the result, and the measured voltage can be displayed directly as a resistance.
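As a worked example of the arithmetic behind this, here is a minimal sketch; the 1 A test current and 0.25 mV reading are hypothetical illustrative values, not figures from the build:

```python
# 4-wire (Kelvin) measurement, illustrated with Ohm's law.
# A constant current is forced through the unknown resistance via one
# lead pair, while a second pair senses the voltage drop. Because the
# sense leads carry almost no current, their own resistance drops out.
I_SOURCE = 1.0     # forced test current in amps (hypothetical value)
V_SENSE = 0.00025  # sensed voltage drop in volts (hypothetical value)

r_unknown = V_SENSE / I_SOURCE  # Ohm's law: R = V / I
print(round(r_unknown * 1000, 4))  # result in milliohms: 0.25
```

With a single-lead-pair measurement, even 50 mΩ of lead and contact resistance would swamp a reading this small; the 4-wire arrangement makes it a non-issue.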
The finished project not only looks good but can measure up to 2 Ω with a resolution of 0.0001 Ω (that’s 0.1 mΩ). The project was originally designed by [Louis] of [Scullcom Hobby Electronics], and [barbouri]’s second iteration adds an improved board layout to the original design.
One of the most famous lectures in the history of technology was delivered by [Douglas Engelbart] in December 1968, at a San Francisco conference. In it he described for the first time most of what we take for granted in our desktop computers and networking today, several years before even the first microprocessor made it to market. It is revered not only because it was the first airing of these ideas, but because it was the event that inspired and influenced many of those who developed them and brought them to market. You may have heard of it by its popular name: the Mother of All Demos.
This was an exciting time to be a technologist, as it must have been obvious that we lay on the brink of an age of ubiquitous computing. [Engelbart] was by no means alone in looking to the future and trying to imagine the impact that the new developments would have in the decades to come. On the other side of the Atlantic, at the British Post Office Telephone research centre at Dollis Hill, London, his British counterparts were no less active in their crystal-ball gazing. In 1969 they produced our film for today, entitled, complete with misplaced apostrophe, “Telecommunications Services For The 1990’s”, and from our 2017 viewpoint it provides a quaint but fascinating glimpse of what almost might have been.
Until the 1980s, the vast majority of British telephone services were a tightly regulated state monopoly run as part of the Post Office. There were only a few models of telephone available in the GPO catalogue, all of which were fixed installations with none of the phone sockets we take for granted today. Accessories such as autodiallers or answering machines were eye-wateringly expensive luxuries you’d only have found in offices, and since the fax machine was unheard of, the height of data transfer technology was the telex. Thus in what later generations would call consumer information technology there really was only one player, so when they made pronouncements on the future they were a good indication of what you were likely to see in your home.
The film starts with a couple having a conversation, she in her bedroom and he in a phone box. Forgotten little touches such as a queue for a phone box or the then-cutting-edge-design Trimphone she’s using evoke the era, and the conversation leaves us hanging with the promise that their conversation would be better with video. After the intro sequence we dive straight into how the GPO thought their future network would look, a co-axial backbone with local circuits as a ring.
The real future-gazing starts with an office phone call to an Australian, at which we’re introduced to their concept of video calling with a colour CRT in a plastic unit that could almost be lifted from the set of The Jetsons. The presenter then goes on to describe a mass information service which we might recognise as something like our WWW, before showing us the terminal in more detail. Alongside the screen is a mock-up of a desktop console with keypad, cassette-based answerphone recorder, and a subscriber identity card slot for billing purposes. Period touches are a brief burst of the old harsh dial tone of a Strowger exchange, and mention of a New Penny, the newly-decimalised currency. We’re then shown the system transmitting a fax image, of which a hard copy is taken by exposing a photographic plate to the screen.
Perhaps the most interesting sequence shows their idea of how an online information system would look. Bank statements and mortgage information are retrieved, though all with the use of a numeric keypad rather than [Engelbart]’s mouse. Finally we see the system being used in a home office, a situation shown as farcical because the worker is continually harassed by his children.
So nearly five decades later, what did they get right and how much did they miss? Oddly, the area in which you might expect them to be most accurate is the one in which they failed most. The BT telecommunications backbone is now fibre-optic, and for the vast majority of us the last mile or two is still the copper pair it would have been a hundred years ago. In terms of the services, though, we have all of the ones they show us, even if not in the form they envisaged. Fax and answering machines were everyday items by the 1980s, and though it didn’t gain much traction at the time, we had video calling as a feature of most offices by the 1990s. We might, however, have expected them to anticipate a fax machine with a printer; after all, it was hardly new technology. Meanwhile the online service they show us is visibly an ancestor of Prestel, which they launched in the late 1970s and which failed to gain significant traction due to its expense.
Another area they miss is wireless. We briefly see a pager, but even though they had a VHF radio telephone service and the ancestors of our modern cellular services were on the drawing board on the other side of the Atlantic at the time, they completely miss a future involving mobile phones.
The full film is below the break. It’s a charming period production, and the wooden quality of the action shows us that while the GPO engineers might have been telephone experts, they certainly weren’t actors.
[Paul] had seen similar projects (both one-offs and sold as a product), but wanted to do his own take on it. The principle is simple: The device vibrates the objects at one frequency and strobes LEDs at a slightly different frequency (80 and 79.5 Hz, in this case). The difference between the frequencies (the beat frequency) is what your eye perceives as a very slow (0.5 Hz, here) motion.
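The arithmetic behind that apparent slow motion can be checked in a few lines. This sketch only uses the two frequencies quoted above:

```python
import math

F_VIBRATE = 80.0  # object vibration frequency in Hz (from the article)
F_STROBE = 79.5   # LED strobe frequency in Hz (from the article)

# Each LED flash samples the vibration at t_n = n / F_STROBE, so the
# vibration phase advances slightly more than one full cycle between
# flashes. That leftover phase per flash is what the eye integrates
# into slow apparent motion at the beat frequency.
phase_per_flash = (2 * math.pi * F_VIBRATE / F_STROBE) % (2 * math.pi)
beat_hz = phase_per_flash / (2 * math.pi) * F_STROBE
print(round(beat_hz, 3))  # 0.5 Hz, the difference of the two frequencies
```

The same principle is at work in a mechanic’s timing light, or in film footage where helicopter rotors appear frozen because the frame rate nearly matches the blade-passing frequency.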
The start of World War II threw quantum theory research into disarray. Many European physicists left Europe altogether, and research moved across the ocean to the shores of the United States. The advent of the atomic bomb thrust American physicists into the spotlight, and physicists began to meet on Shelter Island to discuss the future of quantum theory. By this time one thing was certain: the Copenhagen interpretation of quantum theory had triumphed, and challenges to it had mostly died off.
This allowed physicists to focus on a different kind of problem. At this point in time, quantum theory could not deal with the transitional states of particles as they are created and destroyed. It was well known that when an electron came into contact with a positron, the two particles were annihilated, producing at least two very high-energy photons known as gamma rays. On the flip side, gamma-ray photons could spontaneously turn into electron-positron pairs.
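That “very high energy” has a well-defined floor: for annihilation at rest, each photon must carry at least the electron’s rest energy, E = m·c². A quick check using standard physical constants (the numbers below are textbook values, not from the original text):

```python
# Minimum energy of each annihilation photon: the electron rest energy,
# E = m_e * c^2, expressed in kilo-electronvolts.
M_E = 9.1093837e-31   # electron mass in kg
C = 2.99792458e8      # speed of light in m/s
EV = 1.602176634e-19  # joules per electronvolt

e_rest_kev = M_E * C**2 / EV / 1e3
print(round(e_rest_kev))  # 511 keV per photon
```

This is also why pair production runs only in the other direction above a threshold: a gamma ray needs at least twice this energy, about 1.022 MeV, to materialise an electron-positron pair.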
No one could explain why this occurred. It had become obvious to the physicists of the day that a quantum version of Maxwell’s electromagnetic field theory was needed to explain the phenomenon. This would eventually give rise to QED, short for quantum electrodynamics. This is a severely condensed story of how that happened.