There’s a certain elite set of chips that fall into the “cold, dead hands” category, and they tend to be parts that have proven their worth over decades, not years. Chief among these is the ubiquitous 555 timer chip, which nearly 50 years after its release still finds its way into the strangest places. Add in other silicon stalwarts like the 741 op-amp and the LM386 audio amp, and you’ve got a Hall of Fame lineup for almost any project.
That’s exactly the complement of chips that powers this fun little dub siren. As [lonesoulsurfer] explains, dub sirens started out as actual sirens from police cars and the like that were used as part of musical performances. The ear-splitting originals were eventually replaced with sampled or synthesized siren effects for recording studio and DJ use, which leads us to the current project. The video below starts with a demo, and it’s hard to believe that the diversity of sounds this box produces comes from just a pair of 555s coupled by a 741 buffer. Five pots on the main PCB control the effects, while a second commercial reverb module (modified to support echo effects too) adds depth and presence. A built-in speaker and a nice-looking wood enclosure complete the build, which honestly sounds better than any 555-based synth has a right to.
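If you’re wondering where a siren’s pitch range comes from, the 555’s astable-mode equations are simple enough to play with in a few lines of Python. The component values below are purely illustrative, not taken from [lonesoulsurfer]’s schematic:

```python
# Classic 555 astable-mode equations. The component values used in the
# demo are made up for illustration, not from the actual siren circuit.

def astable_frequency(r1, r2, c):
    """Output frequency in Hz for a 555 in astable mode.
    r1, r2 in ohms, c in farads."""
    return 1.44 / ((r1 + 2 * r2) * c)

def duty_cycle(r1, r2):
    """Fraction of each period the output spends high."""
    return (r1 + r2) / (r1 + 2 * r2)

if __name__ == "__main__":
    r1, r2, c = 10e3, 47e3, 10e-9   # 10k, 47k, 10 nF
    print(f"{astable_frequency(r1, r2, c):.0f} Hz")
    print(f"{duty_cycle(r1, r2):.2f}")
```

Sweeping one of those resistances with a pot (or modulating the control-voltage pin from the second 555) is exactly how a two-timer siren gets its wailing pitch sweep.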
Facial recognition is everywhere these days. Cloud servers churn through every picture uploaded to social media, phone cameras help put faces to names, and CCTV systems are being used to trace citizens in their day-to-day lives. You might want to dodge this without arousing suspicion, just for a little privacy now and then. As it turns out, common makeup techniques can help you do just that.
In research from a group at the Ben-Gurion University of the Negev, the team trialled whether careful makeup contouring techniques could fool a facial recognition system. There are no wild stripes or dazzle patterns here; these techniques are about natural looks and are used by makeup artists every day.
The trick is to use a surrogate facial recognition system and a picture of the person who intends to evade. Digital techniques are used to alter the person’s appearance until it fools the facial recognition system. This is then used as a guide for a makeup artist to recreate using typical contouring techniques.
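As a rough illustration of the surrogate idea (and nothing like the paper’s actual pipeline), here’s a toy sketch: the “face” is just a vector, the surrogate recognizer is a fixed linear embedding, and we nudge the image to lower its match score against the enrolled template while keeping the perturbation, the “makeup,” small. Every name and number here is invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(32, 64))        # surrogate embedding model (a stand-in)
face = rng.normal(size=64)           # the subject's "image" as a flat vector
enrolled = face + 0.01 * rng.normal(size=64)   # slightly different enrollment photo
template = W @ enrolled
template /= np.linalg.norm(template)           # reference embedding in the gallery

def similarity(x):
    """Cosine similarity between the probe's embedding and the template."""
    e = W @ x
    return float(e @ template / np.linalg.norm(e))

def similarity_grad(x):
    """Analytic gradient of similarity() with respect to the image."""
    e = W @ x
    n = np.linalg.norm(e)
    return W.T @ (template / n - (e @ template) * e / n**3)

def evade(x0, budget=3.0, lr=0.05, steps=300):
    """Descend the match score, keeping the perturbation inside a budget."""
    x = x0 + 0.1 * rng.normal(size=x0.shape)   # break the symmetric start
    for _ in range(steps):
        g = similarity_grad(x)
        x -= lr * g / (np.linalg.norm(g) + 1e-12)
        delta = x - x0
        m = np.linalg.norm(delta)
        if m > budget:                          # project back into the budget
            x = x0 + delta * (budget / m)
    return x

adv = evade(face)
print(f"match before: {similarity(face):.3f}, after: {similarity(adv):.3f}")
```

The real attack works on actual images against a deep face-recognition network, and the resulting digital perturbation then has to be translated by hand into physically achievable contouring, but the optimize-against-a-surrogate loop is the same shape.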
The theory was tested with a two-camera system in a corridor. Wearing no makeup, the individual was identified correctly in 47.57% of frames in which a face was detected. With random makeup, this dropped to 33.73%; with the team’s intentionally-designed makeup scheme applied, the attacker was identified in just 1.22% of frames. (PDF)
The attack relies on having a good surrogate of the facial recognition system one wishes to fool; without one, it’s difficult to design appropriately natural-looking makeup that defeats the system. Still, it goes to show the power of contouring to completely change one’s look, both to humans and to machines!
Facial recognition remains a controversial issue, but nothing is stopping its rollout across the world. Indeed, your facial profile may already be out there.
There was a time when people like us might own a tube tester, and even if you didn’t, you probably knew which drug store had a tube testing machine you could use for free. We aren’t sure whether that’s a testament to capitalistic ingenuity or an indictment of tube reliability; maybe both. As [Usagi] has been working on some tube-based projects, he decided he needed a tester, so he built one. You can see the results in the video below.
The tester only uses 24 V, but for the projects he’s building, that’s close to how the tubes will operate in the real circuits. He does have a traditional tube tester, but it runs at hundreds of volts, which is an entirely different operating regime.
Regular doorknobs are widely reviled for their bare simplicity, but by and large society has so many other problems that the issue never really comes up in day-to-day conversation. Fear not, however, for [Matthew] has created something altogether more special: a doorknob in the shape of his own outstretched hand.
The build was inspired by a similar doorknob at the WNDR museum in Chicago, and it’s one you can recreate yourself, too. It’s achieved through a multi-stage mold-making process. [Matthew]’s first step was to make a flexible mold of his hand using Perfect Mold alginate material.
Once the alginate solidified, [Matthew]’s hand was removed and the mold filled with wax. The wax duplicate of [Matthew]’s hand was then used to create an investment plaster mold for casting metal. Vents were added at the end of each fingertip in the mold to allow molten metal to effectively fill the entire cavity.
Once the investment mold was solid and dry, the wax was melted out and it was ready for casting. A propane furnace was used to melt the casting metal and fill the mold using a simple gravity casting method. [Matthew] ended up making two hands, one in aluminium and one in copper. Some cleanup with grinders and a wire wheel, and a replica of [Matthew]’s hand was in his hands!
The Redefine Robots challenge is looking to you for great ideas in making robots part of modern life. For too long, robot design has been driven by visions of what these machines will look like in the future. But what should they look like right now? Sure, that might be C-3PO, but isn’t it more likely that your robot assistant lives on a smartwatch, or that a labor-saving droid helps by passing the butter when limited mobility makes that a challenge for someone? Where are the everyday things that would be better with just a bit of clever technology?
Part of the challenge here is breaking out of that mold developed from decades of seeing robots that tend to take just a few forms: something with four wheels and a camera, or bots designed to mimic the human body. One great example of rethinking these stereotypes is [Harry Gao]’s task lighting robot. It uses machine learning to look for your hands on a work surface and move a bright light to make sure you can always see what you’re doing.
Of course, movement isn’t a prerequisite if you want to think of this as a smart automation challenge. The best robots from science fiction are remembered because of their interaction with people — machines with personality. There’s certainly a place in our world for companion robots that keep you company, like this entry called Stack-chan. It’s not a replacement for human interaction, but a complement to the way we communicate with each other and the world around us.
You still have time to get in on this round if you make this weekend your own personal hackathon. Ten entries will be selected to receive a $500 prize and move on to the final round at the end of October. Next week we’ll begin the final, wildcard round as we head into the fall and eventually award $25,000 for the top prize!
Quiet electric trolling motors are great for gliding into your favorite fishing spot but require constant correction if wind and water currents are at play. As an alternative to expensive commercial GPS-guided trolling motors, [AlexAsplund] created Vanchor, an open source system for adding autopilot to a cheap trolling motor.
To autonomously control an off-the-shelf trolling motor, [Alex] designed a 3D printed steering unit powered by a stepper motor to attach to the original transom mount over the motor’s vertical shaft. A collar screwed to the shaft locks the motor into the steering unit when the motor is lowered. The main controller is a Raspberry Pi, which hosts a WiFi hotspot and web server for control and configuration using a smartphone. Using navigation data from an e-compass sensor and a marine GPS chart plotter, it can hold position, travel in a specified direction, or follow a defined route. [Alex] is also planning to add the option of using a GPS module instead of a commercial plotter.
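The heart of any heading-hold mode like this is a small control loop: wrap the error between the desired course and the compass heading into a signed angle, then command a proportional steering correction. The sketch below is just an illustration of that idea; the gains, steps-per-degree, and travel limits are made up rather than taken from the Vanchor firmware:

```python
# Minimal heading-hold sketch. The e-compass supplies actual_deg,
# and the result is a signed step count for the steering stepper.
# Gains and limits here are illustrative placeholders.

def heading_error(desired_deg, actual_deg):
    """Smallest signed angle from actual to desired, in [-180, 180)."""
    return (desired_deg - actual_deg + 180.0) % 360.0 - 180.0

def steering_steps(desired_deg, actual_deg, kp=0.5, steps_per_degree=10,
                   max_steps=400):
    """Proportional correction, clamped to the steering unit's travel."""
    steps = kp * heading_error(desired_deg, actual_deg) * steps_per_degree
    return int(max(-max_steps, min(max_steps, steps)))

# A boat drifting to 350 deg while holding a course of 010 deg
# gets a positive (starboard) correction rather than a 340-degree spin.
print(steering_steps(10, 350))
```

Holding position or following a route then reduces to repeatedly computing a desired bearing from the GPS fix to the target waypoint and feeding it into the same loop.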
For an estimated total of $300, including the motor, this seems like a viable alternative to commercial systems. Of course, it might be possible to add even more features by integrating the open source ArduRover autopilot, as we’ve seen [rctestflight] do on multiple autonomous vessels. You can also build your own open source chart plotter using OpenCPN, which rivals commercial offerings.
Over at the New York Times (NYT) crossword puzzle desk, newly-appointed Games Editorial Director Everdeen Mason has caused a bit of a ruckus and hubbub (both six letter words with U as the 2nd and 5th letter) among digital puzzle solvers. In a short article published in early August, Ms. Mason announced the end of support for the crossword-solving program Across Lite, abruptly terminating a relationship between the two organizations spanning 25 years. But the ramifications extend much deeper than just one application.
The NYT first published its now-famous crossword puzzle back in 1942, appearing every Sunday, and in 1950 it became a daily feature. In 1993, Will Shortz was chosen as the fourth Crossword Puzzle Editor, a position he still holds today. The NYT online crossword puzzles first appeared in 1996 — puzzle files could be downloaded by modem and solved offline using the program Across Lite.
Modems aside, this basic method has continued until now, and a variety of programs and apps have sprung up over the years that allow not only offline play but also tailored feature sets, such as support for the visually impaired, puzzle fanatics, puzzle creators, team play, and so on. Naturally the NYT joined the party as well, offering the crossword puzzles online and via smartphone apps.