For [Peter Le Roux]’s first “real” electronics project, he decided to make a camera based on the venerable Raspberry Pi platform. But he didn’t just want a regular camera; he wanted something that could shoot in near-IR wavelengths…
It’s a well-known fact that you can remove the IR-blocking filter from most cameras to create a quasi-IR camera – heck, that hack has been around nearly as long as we have! The problem is that even if you let IR light reach the camera’s sensor, you still get all the other light unless you add some kind of filter. There are several ways of doing that, so [Peter] decided to try them all, mounting the filters on an adjustable wheel so he can flip between them.
He designed the case after the PiBow enclosure – you can see our full Pi Case Roundup here – and had it all laser cut out of wood. Stick around after the break to see a nice explanation of the light spectrum and the various filters [Peter] uses.
[Afrdt] started by making two EMF coil antennas and sewing them to cuffs that snap together. She crafted fashionable fabric strips that both conceal and carry the cables from the coils to an Adafruit FLORA sewn into the body of the dress. The wearer experiences haptic feedback via vibration motors in the chest, and sonic feedback from a mini female headphone jack built into the collar. The zipper functions as a low-pass filter and volume control for the jack. One side bears resistive tape and runs to the FLORA, which is programmed to play an 800Hz tone. The other side runs to the headphone jack via conductive thread. As the zipper is opened, the pitch rises toward a maximum of 880Hz.
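The write-up doesn’t spell out the firmware, but the zipper-to-pitch behavior described above maps onto just a few lines of Arduino code. Here’s a minimal sketch of how the FLORA side might look, assuming the resistive zipper tape forms one leg of a voltage divider into an analog pin and the headphone jack hangs off a digital pin – the pin choices and scaling are placeholders for illustration, not details from [Afrdt]’s build.

```cpp
// Hypothetical sketch of the zipper-to-pitch behavior described above.
// Pin assignments are assumptions, not taken from [Afrdt]'s project.
const int ZIPPER_PIN = A9;   // resistive zipper tape as one leg of a voltage divider
const int AUDIO_PIN  = 6;    // square-wave output toward the collar's headphone jack

const int BASE_PITCH = 800;  // Hz, zipper fully closed
const int MAX_PITCH  = 880;  // Hz, zipper fully open

void setup() {
  pinMode(AUDIO_PIN, OUTPUT);
}

void loop() {
  // 0-1023 reading changes as the zipper opens and more tape is in circuit
  int raw = analogRead(ZIPPER_PIN);

  // Map zipper travel onto the 800-880Hz range mentioned in the article
  int pitch = map(raw, 0, 1023, BASE_PITCH, MAX_PITCH);

  tone(AUDIO_PIN, pitch);
  delay(20);  // small settling delay to keep pitch changes smooth
}
```

A real build would also have to handle the EMF coils and vibration motors, but the core of the sonic feedback is just this analog-read-to-tone mapping.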
Much of robotics has been advanced by recreating animals’ movements – why reinvent the wheel when nature got it right first? But have you seen many aquatic creatures’ movements re-imagined with mechanical linkages? The Foundation for Research and Technology-Hellas (FORTH) recently presented its robotic octopus at the International Conference on Intelligent Robots and Systems.
The eight-armed (or is it legged?) roboctopus is modeled on a real octopus, which has a really cool method of propulsion that lets it move at speeds of up to 40km/h. The researchers in Greece created slim silicone arms to recreate this movement, allowing their robot to propel itself at around 10cm/s. After adding webs to the arms, they were able to almost double its top speed to 18cm/s, or one-half its body length per second.
The cool thing about the bot is that other marine wildlife seems relatively unperturbed by it, which could open up many possibilities in underwater research!
Moscow artist [Dmitry Morozov] makes phenomenal geek-art. (That’s not disrespect — rather the highest praise.) And with Solaris, he’s done it again.
The piece itself looks like something out of a sci-fi or horror movie. Organic black forms coalesce and fade away underneath a glowing pool of green fluid. (Is it antifreeze?) On deeper inspection, the blob is moving in correspondence with a spectator’s brain activity. Cool.
You should definitely check out the videos. We love to watch ferrofluid just on its own — watching it bubble up out of a pool of contrasting toxic-green ooze is icing on the cake. Our only wish is that the camera spent more time on the piece itself.
Two minutes into the first video we get a little peek behind the curtain, and of course it’s done with an Arduino, a couple of motors, and a large permanent magnet. Move the motor around with input from an EPOC brain-activity sensor and you’re done. As with all good art, though, the result is significantly greater than the sum of its parts.
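No source is posted, but the control loop is simple enough to guess at. The sketch below shows one way an Arduino could sweep a servo-mounted magnet based on an activity value streamed over serial from a PC talking to the EPOC headset – the serial protocol, servo arrangement, and pin are illustrative assumptions, not [Dmitry]’s actual setup.

```cpp
// Hypothetical Arduino-side control loop: a PC reads the EPOC headset and
// streams a 0-100 "activity" value over serial (one number per line); we sweep
// a servo-driven magnet carriage wider as activity rises.
#include <Servo.h>

Servo carriage;      // servo that pans the permanent magnet under the dish
int activity = 0;    // last brain-activity value received (0-100)

void setup() {
  Serial.begin(115200);
  carriage.attach(9);   // assumed servo pin
}

void loop() {
  // Update the activity level whenever the PC sends a new value, e.g. "42\n"
  if (Serial.available()) {
    long v = Serial.parseInt();
    activity = constrain(v, 0, 100);
  }

  // Sweep amplitude grows with activity: calm = small wiggle, excited = wide sweep
  int sweep = map(activity, 0, 100, 5, 80);
  carriage.write(90 - sweep / 2);
  delay(300);
  carriage.write(90 + sweep / 2);
  delay(300);
}
```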
[Dmitry’s] work has been covered many, many times already on Hackaday, but he keeps turning out the gems. We could watch this one for hours.
Tomorrow we mark 10 wonderful years of reading Hackaday. Share your experience by recording a 1-2 minute video about how you discovered Hackaday and your favorite hack from all the greats that have hit the front page. Tweet the link to your video to @Hackaday with the hashtag #10years and we’ll add it to the playlist.
It doesn’t need to be anything special (but go nuts if you wish). I recorded a one-shot talking-head format as an example.
If you are lucky enough to be in the LA area, get a free ticket for Saturday’s event. In addition to all the clinicians and speakers, there’s a small collection of the Hackaday crew in town.
I’ve been a huge fan of EMSL for quite some time now, and my recent field trip proved that it has earned the name Evil Mad Scientist Laboratories for good reason. For instance, look at the reflection in the glass near the bottom and you’ll glimpse the hearse that [Lenore] and [Windell] have sitting in front of the shop. But that’s just the threshold: inside there are delights that ate up a couple of hours without me even noticing. And they thought they were going to get work done that day.
Don’t judge me by my appearance. This is late afternoon on a summer Saturday in Sunnyvale. Why does that matter? Obviously summer Saturdays in Silicon Valley always start with the Electronics Swap Meet and Engineer’s breakfast! That was a ton of fun, but if you’re doing it right it’s also a bit tiring. No worries, though: a shot of excitement came over me as soon as I walked in that front door.
Using an Oculus Rift, a Leap Motion controller, and a beta of Unity 4.6, [Tomáš Mariančík] put together a test environment for physical interaction. The Leap Motion controller can track your fingers in extremely high detail, which lets him create a pair of virtual hands inside the test environment that almost perfectly mimic his movements. The hack here is making it all work together.
In the following demo he shows off by interacting with holographic menus, grabbing body parts off of an anatomically correct human being (thanks to Unity3D), and manipulating his environment.
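To get a feel for the tracking data [Tomáš] is working with, here’s a bare-bones C++ example against the Leap Motion v2 SDK’s polling API. It just dumps fingertip positions to the console, where his demo feeds the equivalent data into Unity to pose the virtual hands – this is a sketch of typical SDK usage, not his code.

```cpp
// Minimal Leap Motion polling example (v2 C++ SDK): print fingertip positions.
// A sketch of the tracking data available, not [Tomáš]'s Unity integration.
#include <chrono>
#include <iostream>
#include <thread>
#include "Leap.h"

int main() {
  Leap::Controller controller;

  while (true) {
    Leap::Frame frame = controller.frame();   // latest tracking frame

    for (const Leap::Hand &hand : frame.hands()) {
      for (const Leap::Finger &finger : hand.fingers()) {
        Leap::Vector tip = finger.tipPosition();  // millimetres, relative to the device
        std::cout << (hand.isLeft() ? "L" : "R")
                  << " finger " << finger.type()
                  << " at (" << tip.x << ", " << tip.y << ", " << tip.z << ")\n";
      }
    }

    // Poll at roughly 60Hz; a real app would use a Listener callback instead
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
  }
}
```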