[Uri Shaked] accidentally touched a GPIO pin on his 3.3 V board with a 12 V alligator clip, frying the board. Sound familiar? A replacement would have cost $60, which for him wasn’t cheap. Also, he needed it for an upcoming conference so time was of the essence. His only option was to try to fix it, which in the end involved a delicate chip transplant.
Fortunately, he had the good instinct to feel the metal shield over the nRF52832 immediately after the event. It was hot. Applying 3.3 V to the board now also heated up the chip, confirming for him that the chip was short-circuiting. All he had to do was replace it.
Digging around, he found another nRF52832 on a different board. To our surprise, transplanting it and getting the board up and running again took only an hour, including the time to document it. If that sounds simple, it was only in the way that a skilled person makes something seem simple. It included plenty of delicate heat gun work, some soldering iron microsurgery, and persistence with a J-Link debugger. But we’ll leave the details of the operation and its complications to his blog. You can see one of the steps in the video below.
You may think that cathode ray tube (CRT) TVs and monitors have gone the way of the dinosaur, but you’d be wrong. Many people still use them for playing video games at home or in arcades, for vintage computing, and yes, even for watching television programs. [Nesmaniac] uses his TV for playing Super Mario Bros, but for several years it had a red area in the top right corner thanks to a nearby lightning strike. Sadly, the discoloration stood out particularly well against the game’s blue background. His solution was to make a degaussing coil.
We have an article explaining degaussing in detail but in brief, the red was caused by that area of the metal shadow mask at the front of the display becoming magnetized by the lightning strike. One way to get rid of the red area is to bring a coil near it and gradually move the coil away. The coil has AC from a wall socket running through it, producing an oscillating magnetic field which randomizes the magnetic field on the shadow mask, restoring the colors to their former glory.
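As a rough illustration of why the gradual withdrawal matters, the shadow mask sees an AC field whose amplitude decays as the coil recedes, so the magnetization it leaves behind shrinks toward zero. The sketch below models that with a 60 Hz sine under an exponential envelope; all values are arbitrary, not measurements from [Nesmaniac]’s coil.

```python
import math

def field_envelope(t, h0=1.0, tau=2.0):
    """Peak field strength as the coil is slowly pulled away (exponential decay)."""
    return h0 * math.exp(-t / tau)

def degauss_field(t, freq=60.0):
    """Instantaneous AC field seen by the shadow mask at time t (seconds)."""
    return field_envelope(t) * math.sin(2 * math.pi * freq * t)

# After many time constants the remaining field, and thus any residual
# magnetization it could leave behind, is negligible.
print(f"peak field at t=0s:  {field_envelope(0):.4f}")
print(f"peak field at t=20s: {field_envelope(20):.6f}")
```

Yanking the coil away abruptly would instead freeze in whatever polarity the mains happened to be at that instant, which is why the slow retreat is the whole trick.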
You’ll find [Nesmaniac’s] video explaining how he made it below. It’s a little cartoonish, but the details are all there, along with the necessary safety warnings. His degaussing coil definitely qualifies as a hack: the coil itself came from a 15″ CRT monitor, and his on/off switch came from a jigsaw. A 100 watt incandescent bulb serves as a series resistance to limit the current, and if more or less current is needed, the bulb can be swapped for one with a different wattage.
To demonstrate it in action and give a few more construction details, we’ve included a second video below by [Arcade Jason] who made his for degaussing arcade game screens.
It’s been a while since we’ve shown a DIY wire bending machine, and [How To Mechatronics] has come up with an elegant design with easy construction through the use of 3D-printed parts which handle most of the inherent complexity. This one also has a Z-axis so that you can produce 3D wire shapes. And as with all wire bending machines, it’s fun to watch it in action, which you can do in the video below along with seeing the step-by-step construction.
One nice feature is that he’s included a limit switch for automatically homing the Z-axis when you first turn it on. It also uses a single 12 volt supply for all the motors and for the Arduino that acts as the brains. The 5 volts for the one servo motor is stepped down from 12 V using an LM7805 linear regulator. He’s also done a nice job packaging the Arduino, stepper motor driver boards, and the discrete components onto a single custom surface mount PCB.
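That LM7805 is worth a quick thermal sanity check: a linear regulator dissipates the entire 12 V to 5 V drop times the load current as heat. The servo currents below are hypothetical, just to show the arithmetic:

```python
V_IN, V_OUT = 12.0, 5.0  # supply and regulated output, per the build

def regulator_dissipation(load_current_a):
    """Heat in watts burned by a linear regulator at a given load current."""
    return (V_IN - V_OUT) * load_current_a

for i_a in (0.1, 0.5, 1.0):
    print(f"servo draw {i_a:.1f} A -> {regulator_dissipation(i_a):.1f} W in the regulator")
```

At a stalled-servo half amp that is already 3.5 W, which is heatsink territory for a TO-220 package, so the 7805 is fine for a small servo but wouldn’t scale to the steppers.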
The bender isn’t without issues, though. For one, there’s no automatic method for giving it bending instructions. You can write the steps into an Arduino sketch, which is really just a lot of copy and paste, and he’s also provided a manual mode in which you give it simple commands from a serial terminal. However, it would be only one more step to read those same commands from a file, or perhaps even to convert them from G-code or some other format.
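That extra step could be as little as a file parser feeding the existing serial interface. The command names below are hypothetical stand-ins, not [How To Mechatronics]’s actual protocol:

```python
def parse_bend_file(lines):
    """Parse lines like 'FEED 20', 'BEND 90', 'ROTATE -45' into (op, value) pairs."""
    commands = []
    for raw in lines:
        line = raw.split("#")[0].strip()  # allow '#' end-of-line comments
        if not line:
            continue
        op, value = line.split()
        commands.append((op.upper(), float(value)))
    return commands

steps = parse_bend_file([
    "FEED 20      # feed 20 mm of wire",
    "BEND 90      # bend 90 degrees",
    "ROTATE -45   # tilt the Z-axis",
    "FEED 20",
])
for op, value in steps:
    # With pyserial, this would become: port.write(f"{op} {value}\n".encode())
    print(op, value)
```

A G-code converter would just be another front end emitting the same (op, value) stream.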
Another issue is that the wire straightener puts too much tension on the wire, preventing the feeder from pulling the wire along. One solution is to feed it pre-straightened wire, which isn’t too much to ask since it’s really the bending we’re after. But fixing the problem outright could be as simple as changing two parts. In the feeder, the wire is pulled between a copper pipe and a flat steel bearing, and we can’t help wondering whether replacing them with a knurled cylinder and a grooved one would work, as the people at [PENSA] did with their DIWire, which we wrote about back in 2012. Sadly, the blog entries we linked to no longer work, but a search shows that their Instructable is still up if you want to check out their feeder parts.
As for the applications, we can think of sculpting, fractal antennas, tracks for marble machines, and really anything which could use a wireframe for its structure. Ideas anyone?
We’ve all heard the range of sounds that can be made electronically from mostly discrete components, but what [Kelly Heaton] has achieved with her many experiments is a whole other world: the world of nature, to be exact. Her seemingly chaotic circuits create a nighttime symphony of frogs, crickets, and katydids, and a stroll through her Hackaday.io logs makes it crystal clear how she does it, a read surely as delightful as a nocturnal walk through her Virginia countryside.
The visual and aural sensations of the video below will surely tempt you further, but in case they don’t, here’s a taste. When Radio Shack went out of business, she lost her source of very specific piezo buzzers and so had to reverse engineer them to build her own, right down to making her own amplifiers on circular circuit boards and vacuum forming and laser cutting the housings.

For the sounds, she starts out with a simple astable multivibrator circuit, demonstrating how to create asymmetry by changing capacitors, and then combines two of the circuits to get something which sounds just like a cricket. She then shows how to add katydids, which enhance the nighttime symphony with percussive sounds much like a snare drum or hi-hat. It’s all tied together with her Mother Nature Board, built up from a white noise generator, Schmitt trigger, and shift registers to turn the different sound circuits on and off, providing a more unpredictable and realistic nighttime soundscape. The video below shows the combined result, though she admits she’ll never really be finished. And be sure to check out even more photos and videos of her amazing work in the gallery on her Hackaday.io page.
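The classic two-transistor astable she starts from has a period set by its two base RC pairs, each half-period being ln(2)·R·C, so mismatching the capacitors is exactly what skews the duty cycle into a chirp. The component values below are illustrative, not [Kelly Heaton]’s actual parts:

```python
import math

def astable_frequency(r1_ohms, c1_farads, r2_ohms, c2_farads):
    """Oscillation frequency of a two-transistor astable multivibrator."""
    period = math.log(2) * (r1_ohms * c1_farads + r2_ohms * c2_farads)
    return 1.0 / period

# Roughly 4 kHz is in cricket territory:
f_sym = astable_frequency(10e3, 18e-9, 10e3, 18e-9)
print(f"symmetric:  {f_sym:.0f} Hz")

# Shrinking one capacitor skews the duty cycle (shorter 'on' half-cycle):
f_asym = astable_frequency(10e3, 18e-9, 10e3, 4.7e-9)
print(f"asymmetric: {f_asym:.0f} Hz")
```

Two such oscillators, one gating the other at a few hertz, is the essence of the cricket: a fast tone turned on and off in slow bursts.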
In the 1966 science fiction movie Fantastic Voyage, medical personnel are shrunken to the size of microbes to enter a scientist’s body to perform brain surgery. Due to the work of this year’s winners of the Nobel Prize in Physics, laser tools now do work at this scale.
Arthur Ashkin won for his development of optical tweezers, which use a laser to grip and manipulate objects as small as a molecule. And Gérard Mourou and Donna Strickland won for coming up with a way to produce ultra-short, high-intensity laser pulses, now used to perform millions of corrective laser eye surgeries every year.
Here is a look at these inventions, their inventors, and the applications which made them important enough to win a Nobel.
AI is currently popular, so [Chris Lam] figured he’d stimulate some interest in amateur radio by using it to pull call signs from radio signals processed using SDR. As you’ll see, the AI did just okay, so [Chris] augmented it with an algorithm invented for gene sequencing.
His experiment was simple enough. He picked up a Baofeng handheld transceiver to transmit messages containing a call sign and some speech. He then used a 0.5 meter antenna, a little connecting hardware, and a NooElec SDR dongle to get the signal into his laptop. There he used SDRSharp to process the messages and output a WAV file, which he passed on to the AI, Google’s Cloud Speech-to-Text service, to convert to text.
Despite speaking his words one at a time and making an effort to pronounce them clearly, the result wasn’t great. In his example, only the first two words of the call sign and actual message were correct. Perhaps if the AI had been trained on actual off-air conversations with background noise, it would have done better. It’s not quite the same issue, but we’re reminded of those MIT researchers who fooled Google’s Inception image recognizer into thinking that a turtle was a gun.
Rather than train his own AI, [Chris’s] clever solution was to turn to the Smith-Waterman algorithm. This is the same algorithm used for finding similar nucleic acid sequences when analyzing genes. It allowed him to use a list of correct call signs to find the best match for what the AI did come up with. As you can see in the video below, it got the call signs right.
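Smith-Waterman finds the best local alignment between two sequences, tolerating the insertions, deletions, and substitutions a speech-to-text pass introduces. A minimal sketch of the idea, used the way [Chris] does, to score a garbled transcript against a roster of known call signs (the roster and scoring constants here are illustrative, not his actual values):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local-alignment score between strings a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    h = [[0] * cols for _ in range(rows)]  # score matrix, first row/col zero
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = h[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores never go below zero (alignment restarts)
            h[i][j] = max(0, diag, h[i - 1][j] + gap, h[i][j - 1] + gap)
            best = max(best, h[i][j])
    return best

def best_call_sign(transcript, call_signs):
    """Pick the roster entry that aligns best with the AI's transcript."""
    return max(call_signs, key=lambda cs: smith_waterman(transcript, cs))

# A hypothetical roster and a typically mangled speech-to-text result:
roster = ["KM6LYW", "W6BXN", "VE3KCL"]
print(best_call_sign("KAY EM SIX LYW", roster))
```

Because scores reset at zero, a few solidly matched characters anywhere in the transcript are enough to pull out the right call sign even when most of the string is wrong.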
Microsoft is bringing ROS to Windows 10. ROS stands for Robot Operating System, a software framework and large collection of libraries for developing robots, which we recently wrote an introductory article about. It has long been supported primarily under Linux and Mac OS X, and even then works best under Ubuntu. My own efforts to get it working under the Raspbian distribution on the Raspberry Pi led me to instead download an Ubuntu image for the Pi. So having it running on Windows, with Microsoft’s support, will add some welcome variety.
To announce it to the world, they had a small booth at the recent ROSCon 2018 in Madrid. There they showed a Robotis TurtleBot 3 robot running the Melodic Morenia release of ROS under Windows 10 IoT Enterprise on an Intel Coffee Lake NUC and with a ROS node incorporating hardware-accelerated Windows Machine Learning.
Why are they doing this? It may be to help promote their own machine learning products to roboticists and manufacturers. Their recent blog entry says:
We’re looking forward to bringing the intelligent edge to robotics by bringing advanced features like hardware-accelerated Windows Machine Learning, computer vision, Azure Cognitive Services, Azure IoT cloud services, and other Microsoft technologies to home, education, commercial, and industrial robots.
Initially, they’ll support ROS1, the version most people will have used, but also have plans for ROS2. Developers will use Microsoft’s Visual Studio toolset. Thus far it’s an experimental release but you can give it a try by starting with the details here.