Two Cornell students have designed their own multi-factor authentication system. This system uses a PIN combined with a form of voice recognition to authenticate a user. Their system is not as simple as speaking a passphrase, though. Instead, you have to sing the correct tones into the lock.
The system runs on an Atmel ATmega1284P. The chip isn't powerful enough to easily perform real speech recognition, so the team focused their effort on detecting pitch instead. The result is a lock that requires you to sing the correct sequence of pitches. We would be worried about an attacker eavesdropping and singing the key themselves, but the team has a few mechanisms in place to protect against this attack. First, the system also requires a valid PIN, which an attacker can't deduce simply by listening from around the corner. Second, the system maintains the user's specific voice signature.
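Pitch detection on a small micro usually boils down to finding the signal's fundamental period. A common approach is autocorrelation: slide the waveform against a delayed copy of itself and find the lag where they line up best. Here's a minimal floating-point sketch of the idea in Python; the actual SingLock firmware runs on an 8-bit AVR and almost certainly uses fixed-point math and a different window size, so treat the function and its parameters as illustrative assumptions.

```python
import math

def estimate_pitch(samples, sample_rate):
    """Estimate a signal's fundamental frequency by autocorrelation.

    Illustrative sketch only -- not the actual SingLock firmware,
    which does this in fixed point on an ATmega1284P."""
    n = len(samples)
    best_lag, best_corr = 0, 0.0
    # Search lags corresponding to roughly 80 Hz .. 1000 Hz (singing range)
    for lag in range(int(sample_rate / 1000), int(sample_rate / 80)):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag if best_lag else 0.0

# Synthesize a 220 Hz tone and estimate its pitch
rate = 8000
tone = [math.sin(2 * math.pi * 220 * t / rate) for t in range(1024)]
print(estimate_pitch(tone, rate))  # close to 220 Hz
```

The lag search is quantized to whole samples, which is why the estimate comes out near, rather than exactly at, 220 Hz; real implementations interpolate around the correlation peak for finer resolution.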
The project page delves much more deeply into the mathematical theory behind how the system works. It’s worth a read if you are a math or audio geek. Check out the video below for a demonstration. Continue reading “SingLock Protects Your Valuables from Shy People”
There’s just something about the holidays and man’s best friend that brings out the best in people. [Tara Anderson], Director of CJP Product Management at 3D Systems, fostered a husky mix named Derby. Derby was born with a congenital defect: his forelegs were underdeveloped with no paws. This precluded the poor fellow from running around and doing all of the things dogs love to do. [Tara] had fitted him with a wheel cart, but she still felt that Derby deserved more mobility and freedom. Deciding that 3D-printed prosthetics was the answer, she turned to her colleagues and collaborated with Derrick Campana, a certified Animal Orthotist, to create a new set of forelegs for Derby.
The design is different from typical leg prosthetics; [Tara] felt that the usual “running man” blade design would not work for a dog, since the legs would just sink right into the ground. Instead, a “loop” design was used, allowing for more playful canine antics. The legs were constructed using MultiJet Printing on 3D Systems’ ProJet 5500X. MultiJet Printing enabled the prosthetics to be printed with both firm and soft materials, needed for comfort and durability.
Continue reading “Derby’s Got Legs, He Knows How to Use Them”
What do you get when you have a computer-controlled laser pointer and a big sheet of glow in the dark material? Something very cool, apparently. [Riley] put together a great build that goes far beyond a simple laser diode and servo setup. He’s using stepper motors and proper motion control software for this one.
The theory behind the device is simple – point a laser at some glow in the dark surface – but [Riley] is doing this project right. Instead of jittery servos, the X and Y axes of the laser pointer are stepper motors. These are controlled by an Arduino Due and TinyG motion control software. This isn’t [Riley]’s first rodeo with TinyG; we saw him at Maker Faire NYC with a pendulum demonstration that was absolutely phenomenal.
Right now, [Riley] is taking SVG images, converting them to Gcode, and putting them up on some glow in the dark vinyl. Since the Hackaday Skull ‘n Wrenches is available in SVG format, that was an easy call to make on what to display in weird phosphorescent green. You can see a video of that along with a few others below.
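The SVG-to-Gcode step is conceptually simple: extract the path as a polyline, then emit a rapid move to the start, switch the laser on, and trace the points as feed moves. Here's a hypothetical sketch of that conversion; [Riley]'s actual toolchain, TinyG configuration, and laser-enable convention aren't documented here, so the M3/M5 spindle codes and feed rate below are assumptions.

```python
def points_to_gcode(points, feed_rate=1000):
    """Turn a polyline (list of (x, y) tuples, already extracted from
    an SVG path) into minimal Gcode.  Hypothetical sketch -- not
    [Riley]'s actual toolchain."""
    lines = ["G21", "G90"]  # millimetres, absolute positioning
    x0, y0 = points[0]
    lines.append(f"G0 X{x0:.2f} Y{y0:.2f}")  # rapid to start, laser off
    lines.append("M3")                        # spindle-on repurposed as laser enable
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed_rate}")
    lines.append("M5")                        # laser off
    return "\n".join(lines)

print(points_to_gcode([(0, 0), (10, 0), (10, 10)]))
```

Curved SVG segments (Béziers, arcs) would first need to be flattened into short line segments before feeding them through something like this.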
Continue reading “Drawing On Glow In The Dark Surfaces With Lasers”
Lumographic images are like those shimmering patterns you see at the bottom of swimming pools. When water acts as a lens, the patterns of bright and dark are random and wander with the waves above. [Matthew] figured out a way to create fixed images from lens shape alone. The images only morph into view clearly when light shines at the proper angle. At near angles an eerie fun-house mirror effect appears, but too far off and the image scatters unrecognisably.
The exact method for designing the optics is not explained, though we are sure someone in our readership could figure it out. The artist claims it to be a hundred-year-old, million-variable math problem. The lenses are often quite thick and do not resemble much of anything. The effect, however, is sharp, clear, and detailed.
At first he suspected he needed astronomically-expensive military-grade 50 nanometer (0.000002″) precision machining for the lenses, but some friends in the autobody industry gave him a few tips to squeeze good enough accuracy from more affordable industrial machines. The technique also allows for images to appear from mirrors and internal reflections. It is probably not something you can 3D print or machine yourself, but it would be interesting to see someone try.
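To get a feel for why surface shape alone can paint an image, consider a single refracting surface: for a ray arriving straight down, the local slope of the glass sets the incidence angle, and Snell's law sets how far the ray is bent, so the surface profile completely determines where light piles up on the screen below. The toy sketch here is our own first-order illustration of that caustic principle, not [Matthew]'s million-variable design math; the sinusoidal profile and screen distance are arbitrary assumptions.

```python
import math

def deflection(slope, n=1.5):
    """Angle (radians) a vertical ray is bent by on leaving a glass
    surface with the given local slope -- a first-order Snell's law
    sketch, not the artist's actual design method."""
    # For a vertical ray, the incidence angle equals the surface slope
    return math.asin(n * math.sin(slope)) - slope

# Toy lens surface: where each ray lands on a screen a distance d below.
# Rays bunch up (bright) or spread out (dark) purely from the profile.
d = 100.0
surface = lambda x: 0.02 * math.sin(x)                     # height profile
slope = lambda x, h=1e-5: (surface(x + h) - surface(x - h)) / (2 * h)
landings = [x + d * math.tan(deflection(slope(x)))
            for x in (i * 0.01 for i in range(1000))]
```

Designing a lens that produces a *chosen* image means inverting this map (choosing a profile whose landing density matches a target picture), which is where the enormous optimization problem the artist describes comes in.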
[Matthew]’s work is on display in the “Composite” gallery at the National Museum of Mathematics (MoMath) in New York. See the video after the break for a peek at the machinery he uses to manipulate the lenses to enhance the visuals in the exhibit.
Continue reading “Lumographic Images Created With Lens Only”
Whether you call them individually controllable RGB LEDs, WS2812, or NeoPixels, there’s no denying they are extremely popular and a staple of every glowey and blinkey project. Fresh off the reel, they’re nearly useless – you need a controller, and that has led to many people coming up with many different solutions to the same problem. Here’s another solution, notable because it’s the most minimal WS2812 driver we’ve ever seen.
The critical component in this build is NXP’s LPC810, an ARM Cortex-M0+ in an 8-pin DIP package. Yes, it’s the only ARM available in a DIP-8, but it still runs at 30MHz and holds a 4kB program.
JeeLabs is using the SPI bus on the LPC810 to clock out data at the rate the LEDs require. The only additional hardware is a small LED used to drop the supply from 5V to 3.3V and a decoupling capacitor. Yes, you could easily get away with calling this a one-component build.
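The trick to driving WS2812s from an SPI peripheral is picking a clock where each LED bit expands into a fixed pattern of SPI bits whose high/low times land inside the WS2812 timing spec. A common choice is roughly 2.4MHz with three SPI bits per LED bit: `110` for a one, `100` for a zero. Here's a Python sketch of that bit expansion; it shows the general encoding trick, and JeeLabs' exact LPC810 register setup and clock choice may differ.

```python
def ws2812_to_spi(grb_bytes):
    """Expand WS2812 data bytes into an SPI bit stream: at ~2.4 MHz,
    every LED bit becomes three SPI bits (110 for 1, 100 for 0) so the
    pulse widths satisfy the WS2812 timing spec.  General technique
    sketch -- JeeLabs' actual firmware may use different parameters."""
    out = bytearray()
    acc, bits = 0, 0
    for byte in grb_bytes:
        for i in range(7, -1, -1):              # MSB first
            acc = (acc << 3) | (0b110 if (byte >> i) & 1 else 0b100)
            bits += 3
            while bits >= 8:                    # flush whole SPI bytes
                bits -= 8
                out.append((acc >> bits) & 0xFF)
    if bits:                                    # pad any trailing bits
        out.append((acc << (8 - bits)) & 0xFF)
    return bytes(out)

# One full-green pixel (WS2812 expects G, R, B byte order)
print(ws2812_to_spi(bytes([0xFF, 0x00, 0x00])).hex())  # db6db6924924924924
```

Each 24-bit pixel becomes 9 SPI bytes, so the 60-LED ring below needs only 540 bytes per refresh, well within the LPC810's reach even with its tiny RAM.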
The build consists of a ring of sixty WS2812b RGB LEDs, and the chip dutifully clocking out bits at the correct rate. It’s the perfect start to an LED clock project, an Iron Man arc reactor (are we still doing those?), or just random blinkey LEDs stuffed into a wearable.
Thanks [Martyn] for sending this one in.
Take it from someone who has played guitar for over 20 years: reading sheet music can be a big stumbling block to musical enjoyment. Playing by ear is somewhat unreliable, tablature only works well if you’re already familiar with the tune and tempo, and pulling melody from chord charts is like weaving fiction from the dictionary. A lot can be said for knowing basic chord formations, but it can be difficult to get your fingers to mimic what you see on the page, the screen, or someone else’s fretboard. Enter Ukule-LED, a learning tool and all-around cool project by [Raghav and Jeff] at Cornell.
Ukule-LED uses 16 NeoPixels across the first four positions of the fretboard to teach chord positions. All 16 NeoPixels are connected in series to a single pin on an ATmega1284P, which sits on a board mounted to the bottom of the uke along with power and serial connections. [Raghav and Jeff] set the NeoPixels below the surface so as not to interrupt playability. The uke can operate in either of two modes: ‘play’ and ‘practice’. In ‘play’ mode, the user feeds it a text file representing a song’s chords, tempo, and time signature, and the LEDs show the chord changes in real time, like a karaoke teleprompter for fingers. In ‘practice’ mode, the user enters a chord through the CLI, and the lights hold steady until they get a new assignment. Knowing which fingers to use where is up to the user.
To add another layer of learning, major chords alight in green, minor chords in red, and 7th chords in blue. These are the currently supported chord types, but the project was built with open, highly extendable Python sorcery available for download and subsequent tinkering. Go on tour after the break.
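The heart of a system like this is a lookup from chord name to lit LEDs. With 16 pixels covering four strings by four frets, each fingering maps naturally to `string * 4 + (fret - 1)`. Here's a hypothetical sketch of that mapping; the chord shapes are standard ukulele fingerings (G-C-E-A tuning) and the colors follow the project's scheme, but the LED wiring order and data structures are our own assumptions, not the team's actual code.

```python
# Hypothetical chord lookup for a 16-LED (4 strings x 4 frets) layout.
# LED index assumed to be string * 4 + (fret - 1); the real Ukule-LED
# Python tooling may organize this differently.

CHORDS = {
    "C":  [(3, 3)],                      # (string index, fret); G=0 .. A=3
    "G":  [(1, 2), (2, 3), (3, 2)],
    "Am": [(0, 2)],
    "F":  [(0, 2), (2, 1)],
}
COLORS = {"major": (0, 255, 0), "minor": (255, 0, 0), "7th": (0, 0, 255)}

def chord_to_leds(name):
    """Return (led_index, rgb) pairs to light for a chord name."""
    quality = "minor" if name.endswith("m") else "major"
    color = COLORS[quality]
    return [(string * 4 + fret - 1, color) for string, fret in CHORDS[name]]

print(chord_to_leds("C"))    # C major: A string, 3rd fret, in green
print(chord_to_leds("Am"))   # A minor: G string, 2nd fret, in red
```

In ‘play’ mode the firmware would step through a song file and push each chord's pixel list out to the NeoPixel string on the beat; in ‘practice’ mode it would hold one lookup result until the next CLI command.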
Continue reading “Tiptoe Through the Tulips in No Time With Ukule-LED”
The ESP8266 Internet of Things module is the latest and greatest thing to come out of China. It’s ideal for turning plastic Minecraft blocks into Minecraft servers, making your toilet tweet, or for some bizarre home automation scheme. This WiFi module is not, however, certified by the FCC. The chipset, on the other hand, is.
Having a single module that can run code, act as a UART-to-WiFi transceiver, and peek and poke a few GPIOs, all priced at about $4, is a game changer, and all your favorite silicon companies are freaking out wondering how they’re going to beat the ESP8266. Now the chipset is FCC certified, which is the first step toward turning these modules into products.
This announcement does come with a few caveats: the chipset is certified, not the module. Each version of the module must be certified by itself, and there are versions that will never be certified by the FCC. Right now, we’re looking at the ESP8266-06, -07, -08, and -12 modules – the ones with a metal shield – as the only ones that could potentially pass an FCC cert. Yes, those modules already have an FCC logo on them, but remember that you’re looking at something sold for under $5 out of China.
Anyone wanting to build a product with the ESP will, of course, also need to certify it with the FCC. This announcement hasn’t broken down any walls, but it has cracked a window.