Long ago, when mainframes ruled the earth, computers were mute. In this era before MP3s and MMUs, most home computers could only manage a simple beep or two. Unless you had an add-on device like the Covox Speech Thing, that is. This 1986 device plugged into your parallel port and allowed you to play sound. Glorious 8-bit, mono sound. [Yeo Kheng Meng] had heard of this device, and wondered what it would take to get it running again on a modern Linux computer. So he found out in the best possible way: by doing it.
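The Covox itself is just a resistor-ladder DAC hanging off the parallel port's eight data lines, so "playing sound" means writing one unsigned 8-bit sample to the port for every tick of the sample clock. Here's a minimal sketch of the software side; the sample rate and the `/dev/parport0` route are assumptions, and real playback also needs much tighter timing than plain userspace can promise:

```python
import math

def sine_samples_8bit(freq_hz, duration_s, rate_hz=8000):
    """Generate unsigned 8-bit PCM samples centred on 128 --
    one byte per sample, the format a Covox-style R-2R DAC expects."""
    n = int(duration_s * rate_hz)
    return bytes(
        int(128 + 127 * math.sin(2 * math.pi * freq_hz * t / rate_hz))
        for t in range(n)
    )

samples = sine_samples_8bit(440, 0.01)  # 10 ms of A440
# On real hardware, each byte would be written to the parallel port's
# data lines at the sample rate -- e.g. via /dev/parport0 (ppdev) or
# raw outb() to 0x378. Hitting the sample clock reliably is the hard part.
```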
It has become commonplace to yell out commands to a little box and have it answer you. However, voice input for the desktop has never really gone mainstream. Progress has been particularly slow for Linux users, whose options are shockingly limited, even though decent speech support is baked into recent versions of Windows and into OS X from Yosemite onward.
There are four well-known open speech recognition engines: CMU Sphinx, Julius, Kaldi, and the recent release of Mozilla’s DeepSpeech (part of their Common Voice initiative). The trick for Linux users is successfully setting them up and using them in applications. [Michael Sheldon] aims to fix that — at least for DeepSpeech. He’s created an IBus plugin that lets DeepSpeech work with nearly any X application. He’s also provided PPAs that should make it easy to install for Ubuntu or related distributions.
You can see in the video below that it works, although [Michael] admits it is just a starting point. However, the great thing about Open Source is that armed with a working setup, it should be easy for others to contribute and build on the work he’s started.
If you’ve played Valve’s masterpiece Portal, there are probably plenty of details that stick in your mind even a decade after its release. The song at the end, GLaDOS, “The cake is a lie”, and so on. Part of the reason people are still talking about Portal after all these years is the imaginative world building that went into it. One of these little nuggets of creativity has stuck with [Alexander Isakov] long enough that it became his personal mission to bring it into the real world. No, it wasn’t the iconic “portal gun” or even one of the oft-quoted robotic turrets. It’s that little radio that plays a jingle when you first start the game.
Alright, so perhaps it isn’t the part of the game that we would be obsessed with turning into a real-life object. But for whatever reason, [Alexander] simply had to have that radio. Of course, this being the 21st century, his version isn’t actually a radio; it’s a Bluetooth speaker. Though he did go through the trouble of adding a fake display showing the same frequency the in-game radio was tuned to.
The model he created of the Portal radio in Fusion 360 is very well done, and available on MyMiniFactory for anyone who might wish to create their own Aperture Science-themed home decor. Though fair warning, due to its size it does consume around 1 kg of plastic for all of the printed parts.
For the internal Bluetooth speaker, [Alexander] used a model which he got for free after eating three packages of potato chips. That sounds about the best possible way to source your components, and if anyone knows other ways we can eat snack food and have electronics sent to our door, please let us know. Even if you don’t have the same eat-for-gear promotion running in your neck of the woods, it looks like adapting the model to a different speaker shouldn’t be too difficult. There’s certainly enough space inside, at least.
Over the years we’ve seen some very impressive Portal builds, going all the way back to the infamous levitating portal gun [Caleb Kraft] built in 2012. Yes, we’ve even seen somebody do the radio before. At this point it’s probably safe to say that Valve can add “Create cultural touchstone” to their one-sheet.
Back in the old days, when we were still twiddling bits with magnetized needles, changing the data on an EPROM wasn’t as simple as shoving it in a programmer. These memory chips were erased with UV light shining through a quartz window onto a silicon die. At the time, there were neat little blacklights in a box sold to erase these chips. There’s little need for these chip erasers now, so how do you erase and program a chip these days? Build your own chip eraser using components that would have blown minds back in the 70s.
[Charles] got his hands on an old 2764 EPROM for a project, but this chip had a problem — there was still data on it. Fortunately, old electronics are highly resistant to abuse, so he pulled out the obvious piece of equipment to erase the chip: a 300 watt tanning lamp. This almost burnt down the house, and after a second, six-hour round under the lamp, there were still unerased bits.
Our ability to generate UV light has improved dramatically over the last fifty years, and [Charles] remembered he had an assortment of LEDs, including a few tiny 5 mW UV LEDs. Can five milliwatts do what three hundred watts couldn’t? Yes: the LED emitted the right wavelength to flip a bit, and erasing an EPROM is a function of intensity and time. All you really need to do is shine an LED onto the chip for a few hours.
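To see why intensity and time — in other words, dose — is the right way to think about this, here's a back-of-envelope calculation. Every number in it is an illustrative assumption, not a datasheet value:

```python
# Erasure is about accumulated UV dose: dose = intensity x time.
# A commonly quoted figure for full erasure near 254 nm is on the
# order of 15 W*s/cm^2 delivered to the die (assumed, not measured).
ERASE_DOSE = 15.0          # W*s per cm^2 (assumed)

# The tanning lamp is hundreds of watts, but nearly all of it is
# UV-A and heat at wavelengths that barely count toward the dose.
# For a small UV LED held right over the quartz window, assume
# roughly 1 mW/cm^2 of *useful* UV actually reaches the die.
useful_intensity = 0.001   # W/cm^2 (assumed)

seconds = ERASE_DOSE / useful_intensity
hours = seconds / 3600
print(f"estimated erase time: ~{hours:.1f} hours")  # roughly 4 hours
```

With those assumed numbers you land at "a few hours," which is in the same ballpark as what [Charles] saw — the point being that a tiny source at the right wavelength beats a huge one at the wrong wavelength.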
With this vintage chip erased, [Charles] slapped together an EPROM programmer — with a programming voltage of 21V — out of an ATMega and a bench power supply. It eventually worked, allowing [Charles]’ project, a vintage liquid crystal display, to have the right data using vintage-correct parts.
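The programming side is simple in principle: with Vpp raised to 21 V, you present an address and a data byte, pulse the program line, and read back to verify. Programming can only clear bits from 1 to 0; only UV erasure sets them back. Here's a sketch of that loop against a simulated chip — the pin-level details ([Charles]'s ATmega's job) are stubbed out, and this is not his actual code:

```python
# Sketch of a 2764-style programming loop, simulated -- no hardware.
# On the real chip, program_pulse() would drive the address and data
# lines and pulse /PGM with Vpp at 21 V; here it just models the fact
# that programming can only flip bits from 1 to 0.

ERASED = 0xFF

class SimulatedEprom:
    def __init__(self, size=8192):           # 2764 = 8 KiB
        self.cells = bytearray([ERASED] * size)

    def program_pulse(self, addr, data):
        self.cells[addr] &= data              # bits can only clear

    def read(self, addr):
        return self.cells[addr]

def program(chip, image):
    for addr, byte in enumerate(image):
        chip.program_pulse(addr, byte)
        if chip.read(addr) != byte:           # verify, like a real programmer
            raise RuntimeError(f"verify failed at {addr:#06x}")

chip = SimulatedEprom()
program(chip, b"HELLO")
```

The verify step is also why an unerased chip fails to program: any bit that needs to go from 0 back to 1 simply won't, and the read-back mismatch flags it.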
We once knew a guy who used to tell us that the first ten times he flew in an airplane, he jumped out of it. It was his eleventh flight before he walked off the plane. [Mathias Lasser] has a similar story. Despite being one of the pair who decoded the iCE40 bitstream format a few years ago, he admits in his 34C3 talk that he never learned how to use FPGAs. His talk covers how he reverse engineered the iCE40 and the Xilinx 7 series devices.
If you are used to thinking about FPGAs in terms of Verilog and VHDL, [Mathias] will show you a whole new view of rows, columns, and tiles. Even if you don’t ever plan to work at that level, understanding the hardware at a low level can inspire insights that are harder to come by at the abstraction level of an HDL.
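To give a flavour of that row/column/tile view, here's a toy parser for an IceStorm-style ASCII bitstream. The real `.asc` format has many more record types; this sketch handles only `.logic_tile x y` records followed by rows of configuration bits, which is a deliberate simplification:

```python
# Toy parser: map (x, y) tile coordinates to their rows of config bits.
# Only ".logic_tile" records are handled -- the real IceStorm .asc
# format has more record types (IO tiles, RAM tiles, comments, ...).

def parse_tiles(text):
    tiles = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(".logic_tile"):
            _, x, y = line.split()
            current = (int(x), int(y))
            tiles[current] = []
        elif current is not None and line and set(line) <= {"0", "1"}:
            tiles[current].append(line)   # one row of configuration bits
        else:
            current = None                # any other record ends the tile
    return tiles

example = """\
.logic_tile 1 1
0101
1100
.logic_tile 2 1
0000
1111
"""
tiles = parse_tiles(example)
# tiles[(1, 1)] -> ['0101', '1100']
```

Reverse engineering then becomes a matter of toggling one feature in the toolchain, diffing the bits of the affected tile, and recording what each bit position means.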
Do you remember your first instrument, the first device you used to measure something? Perhaps it was a ruler at primary school, and you were taught to see distance in terms of centimetres or inches. Before too long you learned that these units are only useful for the roughest of jobs, and graduated to millimetres, or sixteenths of an inch. Eventually as you grew older you would have been introduced to the Vernier caliper and the micrometer screw gauge, and suddenly fractions of a millimetre, or thousandths of an inch became your currency. There is a seduction to measurement, something that draws you in until it becomes an obsession.
Every field has its obsessives, and maybe there are bakers seeking the perfect cup of flour somewhere out there, but those in our community will probably focus on quantities like time and frequency. You will know them by their benches surrounded by frequency standards and atomic clocks, and their constant talk of parts per billion, and of calibration. I can speak with authority on this matter, for I used to be one of them in a small way; I am a reformed frequency standard nut.
An underappreciated facet of the maker movement is wearable technology. For this week’s Hack Chat, we’re going to be talking all about wearable and fashion tech. This includes motors, lighting, biofeedback, and one significantly overlooked aspect of wearables, washability.
For this week’s Hack Chat, we’re sitting down with Kathryn Blair and Shannon Hoover to talk about the workability and washability of fashion tech. Over the last decade or so, wearable tech has become ever more popular, and these advances in the science aren’t just limited to amazing outfits lined with hundreds of Neopixels. Now, we’re dealing with biofeedback, clothing that regulates your body temperature and monitors your vital signs, and necklaces that glow when the sun goes down.
Kathryn and Shannon are part of the team behind MakeFashion, a Calgary-based outfit that has produced over 60 wearable tech garments shown at 40 international events. MakeFashion is introducing designers to wearables through a series of hands-on workshops built around developing wearable electronics and electronic wearables.
One of the key technologies behind MakeFashion is the StitchKit, a development kit designed to add electronics to wearables, now available on Kickstarter. This means everything from uglier Christmas sweaters to interactive clothing.
During this Hack Chat, we’re going to be discussing the design and engineering behind fashion technology, including biofeedback, how motors and lighting work with a human body, and how to design for washability. If you have a question for this Hack Chat, add it to the discussion part of the event page.
Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This Hack Chat is going down Friday, January 19th at noon, Pacific time. Time zones got you down? Here’s a handy countdown timer!
Click that speech bubble to the left, and you’ll be taken directly to the Hack Chat group on Hackaday.io.
You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.