The central controller is a Raspberry Pi running Mozilla’s new smart home operating system. Each individual device is Arduino-based, and when you click through on the site you get a well-designed graphic explaining how to build each one.
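If you haven’t poked at Mozilla’s Web Thing model before, the gist is that every device describes itself as a set of properties the gateway can read and write. As a purely illustrative, minimal sketch (using Mozilla’s webthing Python library with a made-up toggle device; the project’s real devices run Arduino firmware), here’s roughly what exposing a single on/off property looks like:

```python
# Minimal Web Thing sketch using Mozilla's webthing Python library
# (pip install webthing). The device name and property are made up for
# illustration -- the project's actual devices run Arduino firmware.
from webthing import Property, SingleThing, Thing, Value, WebThingServer

def make_toggle():
    thing = Thing(
        'urn:dev:ops:demo-toggle-1',   # unique ID for the thing
        'Demo Toggle',                 # human-readable title
        ['OnOffSwitch'],               # capability hint for the gateway UI
        'A toggle switch the gateway can flip',
    )
    # Expose a boolean "on" property the gateway can read and write.
    thing.add_property(Property(
        thing,
        'on',
        Value(False, lambda v: print('toggle set to', v)),
        metadata={'@type': 'OnOffProperty', 'type': 'boolean', 'title': 'On/Off'},
    ))
    return thing

if __name__ == '__main__':
    # Serve the thing on port 8888; the WebThings Gateway can then
    # discover and add it like any other device on the network.
    server = WebThingServer(SingleThing(make_toggle()), port=8888)
    try:
        server.start()
    except KeyboardInterrupt:
        server.stop()
```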
It’s also fun to see how many people worked together on this project and added their own flair. Whether it’s a unique covering for the devices or a toggle switch that can toggle itself, there are quite a few personal touches.
Like anyone who’s had the sneaking suspicion that Jeff Bezos is listening in on their conversations, we get the need for this. We also love how approachable it makes hacking your own hardware. What are your thoughts?
Autonomous vehicles make a regular appearance around here, as does [Daniel Riley] aka [rctestflight]. His fascination with building long-endurance autonomous vehicles continues, and this time he built an autonomous air boat.
This craft incorporates a lot of the lessons learnt from his autonomous boat built around a plastic food container. One of the biggest issues was that the submerged propellers kept getting tangled in weeds. This led [Daniel] to move his props above the water, sacrificing some efficiency for reliability and turning the craft into an air boat. The boat itself is a catamaran design, with separate 3D printed hulls connected by carbon fibre tubes. As with the Tupperware boat, autonomous control is handled by the open source ArduPilot software.
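ArduPilot does all of the waypoint-following heavy lifting, so the ground station side of a mission like this can be remarkably small. As a hedged illustration (the connection string, port, and mode here are assumptions rather than details of [Daniel]’s setup), here’s a pymavlink sketch that flips an ArduPilot vehicle into AUTO and arms it so a pre-loaded mission starts running:

```python
# Hedged sketch: kick off a pre-loaded ArduPilot mission over telemetry.
# The UDP connection string is an assumption -- adjust for your radio/SITL.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()  # learn the vehicle's system/component IDs

# Switch to AUTO so ArduPilot follows the mission already on the vehicle.
mode_id = master.mode_mapping()['AUTO']
master.mav.set_mode_send(
    master.target_system,
    mavutil.mavlink.MAV_MODE_FLAG_CUSTOM_MODE_ENABLED,
    mode_id)

# Arm the throttle (param1 = 1 means arm, 0 would disarm).
master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
    0,                      # confirmation
    1, 0, 0, 0, 0, 0, 0)    # param1..param7
```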
During testing [Daniel] had another run-in with his old arch-nemesis, seaweed. It turns out the sharp vertical bow is a perfect edge for weeds to hook onto, creating drag and screwing up the craft’s control. [Daniel]’s workaround involved moving the big batteries to the rear, causing the bows to lift almost completely out of the water.
With long endurance in mind right from the start of the project, [Daniel] put it to the test with a 13 km mission on Lake Washington very early one morning. For most of the mission the boat was completely on its own, with [Daniel] stopping at various points along the lake shore to check on its progress. Everything went smoothly until 10 km in, when the telemetry showed the boat slowing down and angling off course, after which it started going in circles. Luckily a lakeside resident offered [Daniel] a kayak, and he managed to recover the half-sunken vessel. He suspects the cause of the failure was a slowly leaking hull. [Daniel] is already working on the next version, and we’re looking forward to seeing what he comes up with. Check out the video after the break. Continue reading “Autonomous Air Boat Vs Lake Washington”→
Right now, you can get a diode laser engraver on eBay for around $100 USD. That sounds like a deal, but it’ll probably use some arcane proprietary software, won’t be terribly accurate, and the laser itself will almost certainly be fully exposed. Of course there’s no shortage of DIY builds which improve upon this situation greatly, but unfortunately the documentation and instructions to replicate them yourself often leave a lot to be desired.
To get a safe and accurate laser platform into the hands of hackers everywhere, we need more well documented open source designs that are actually built with community in mind. Projects like the Engravinator from [Adam Haile]. This isn’t a one-off design with documentation thrown together after the fact, it’s a fully open hardware engraver with a concise assembly guide that’s built from 3D printed parts and readily available components. You’re free to source and print the parts yourself or, eventually, purchase everything as a kit.
The microwave-sized Engravinator is built from standard 2020 aluminum extrusion, and offers a workable area of 130mm x 130mm. There’s a hatch on the front of the enclosure for objects that are small enough to fit inside the machine, but the open bottom and handles on the top also allow the user to place the Engravinator directly onto the work surface. [Adam] says this feature can be especially useful if you’re looking to burn a design into a tabletop or other large object.
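That 130mm x 130mm figure translates directly into the G-code you feed the machine, and a common trick before burning a big workpiece is a low-power framing pass around the full work area to check alignment. Here’s a small sketch that generates such a pass; the GRBL-style commands, feed rate, and power value are assumptions, so check them against whatever firmware the machine is actually running:

```python
# Hedged sketch: generate a GRBL-style "framing" pass that traces the
# 130 mm x 130 mm work area at very low laser power. Feed rate and the
# S power value are assumptions -- check your firmware's power scale
# (e.g. $30/$31 in GRBL) before running anything like this.
X_MAX = Y_MAX = 130  # mm, the advertised work area
FEED = 1500          # mm/min, slow traverse
POWER = 10           # on an assumed 0-1000 S scale, just enough to see the dot

lines = [
    "G21",                          # millimetres
    "G90",                          # absolute coordinates
    "M4 S{}".format(POWER),         # dynamic laser mode, very low power
    "G0 X0 Y0",                     # rapid to origin
    "G1 X{} Y0 F{}".format(X_MAX, FEED),
    "G1 X{} Y{}".format(X_MAX, Y_MAX),
    "G1 X0 Y{}".format(Y_MAX),
    "G1 X0 Y0",
    "M5",                           # laser off
]
print("\n".join(lines))
```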
Outside of the aluminum extrusion and miscellaneous hardware that make up the frame, most of the other parts are 3D printed. Released under the CERN Open Hardware License v1.2 and distributed as both STL and STEP files, the printable parts for the Engravinator are ripe for modification should you be so inclined. The same goes for the DXF files for the enclosure panels, which will need to be cut out of orange acrylic with a CNC or (ironically) a laser.
One of the most fascinating examples of the human brain’s plasticity is its ability to map one sense to another. Some people, for example, report being able to see sound, giving them a supernatural ability to distinguish tones. This effect has also been observed in the visually impaired: in experiments, grids of electrodes were placed on the tongue, or mechanical actuators on the lower back, and the signals from a camera were fed into these grids and translated into shocks or movement. The interesting effect is that the users quickly learned to distinguish objects from this low-resolution input. As they continued to use these devices, they actually reported seeing the objects as their visual centers took over interpreting the input.
Most of these projects are quite bulky, with the usual mess you’d expect from a university laboratory. [Jakob]’s project appears to be trending toward a much more user-friendly product. A grid of haptic actuators is placed on the back of the user’s hand, along with a depth camera. Not only is it somewhat unobtrusive, the back of the hand is very sensitive to touch, and the camera is well placed for a look out at the world.
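The mapping at the heart of these sensory-substitution rigs is conceptually simple: collapse each depth frame into a coarse grid and drive one vibration motor per cell, with nearer obstacles buzzing harder. The sketch below shows the idea in Python; the grid size, depth range, and motor interface are our own assumptions for illustration, not details of [Jakob]’s hardware:

```python
import numpy as np

# Hedged sketch of the core idea: collapse a depth frame into a coarse
# grid and compute one vibration intensity per cell, with nearer
# obstacles buzzing harder. Grid size, depth range, and how intensities
# reach the motors are all assumptions, not details of the actual build.
GRID_ROWS, GRID_COLS = 4, 4          # one cell per haptic motor
NEAR_MM, FAR_MM = 300, 3000          # depth range we care about

def depth_to_intensities(depth_mm: np.ndarray) -> np.ndarray:
    """Map an (H, W) depth image in millimetres to per-motor duty cycles 0..1."""
    h, w = depth_mm.shape
    cells = depth_mm[: h - h % GRID_ROWS, : w - w % GRID_COLS]
    cells = cells.reshape(GRID_ROWS, h // GRID_ROWS, GRID_COLS, w // GRID_COLS)
    # Treat zero (invalid) readings as "far", then let the closest obstacle win.
    nearest = np.where(cells > 0, cells, FAR_MM).min(axis=(1, 3))
    # Linear map: NEAR_MM -> full buzz, FAR_MM or beyond -> off.
    intensity = (FAR_MM - nearest) / (FAR_MM - NEAR_MM)
    return np.clip(intensity, 0.0, 1.0)

# Quick test with a fake frame: a "wall" looming on the right-hand side.
frame = np.full((240, 320), 4000, dtype=np.float32)
frame[:, 200:] = 500
print(depth_to_intensities(frame).round(2))
```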
[Jakob] admits that, as an interaction designer, his hardware hacking skills are still growing. To us, the polish and thought that went into this is already quite impressive, so it’s no wonder he’s one of the Hackaday Prize Finalists.
Journey with me to a time in a faraway internet; a time before we had monetized social media. A time when the page you shared with your friends was your page and not a page on someone’s network. Way back when Visual Basic was what Python is now and JavaScript was a hack mostly used for cool effects. A hero arose. Macromedia Flash opened the gates to the interactive web, and for a chunk of time it consumed more than a decent portion of humanity’s attention and artistic output.
New ground was broken, and anyone who wanted to become an animator or a web designer could manage it in a few tutorials. Only a few years before Flash took off, people had started talking about computers as a source for art in mostly theoretical terms. There were demoscenes, university studies, and professional communities, of course, but were they truly public? Suddenly Flash made computer art an everyday thing. How could computers not be used for art? In schools and offices all over the world, people of varying technical skill would get links to games, animation, and clever sites sent by their friends and colleagues.
For 23 years Flash has built up this incredible creative legacy. Yet it’s not perfect by any means, and it’s a constant headache for our friendly neighborhood super-conglomerates. Apple hates how it drains the battery on their mobile devices, and that it’s a little village outside of their walled garden. Microsoft sees it as another endless source of security violations. They all see it as a competitor product eating into their proprietary code bases. Continue reading “Blend Your Last Frogs. Google Turns A Blind Eye To Flash.”→
Hackaday has open-source running deep in our veins — and that goes for hardware as well as software. After all, it’s great to run open-source software, but if it’s running on black-box hardware, the system is only half open. While software has benefited mightily from all of the advantages of community development, the hardware world has been only recently catching up. And so we’ve been following the RISC-V open-source CPU development with our full attention.
Our keynote speaker for the 2019 Hackaday Superconference is Dr. Megan Wachs, the VP of Engineering at SiFive, the company founded by the creators of the RISC-V instruction-set architecture (ISA). She has also chaired the RISC-V Foundation Debug Task Group, so it’s safe to say that she knows RISC-V inside and out. If there’s one talk we’d like to hear on the past, present, and future of the architecture, this is it.
RISC-V isn’t a particular chip, but rather a design for how a CPU works and a standard for the lowest-level language the machine speaks. In contrast to proprietary CPUs, RISC-V CPUs from disparate vendors can all use the same software tools, unifying and opening their development. Moreover, open hardware implementations of the silicon itself mean that new players can enter the space more easily and bring their unique ideas to life faster, and we’ll all benefit. We can all work together.
It’s no coincidence that this year’s Supercon badge has two RISC-V cores running in its FPGA fabric. When we went shopping around for an open CPU core design, we had a few complete RISC-V systems to pick from, full compiler and development toolchains to write code for them, and of course, implementations in Verilog ready to flash into the FPGA. The rich, open ecosystem around RISC-V made it a no-brainer for us, just as it does for companies making neural-network peripherals or even commodity microcontrollers. You’ll be seeing a lot more RISC-V systems in the near future, on your workbench and in your pocket.
We’re tremendously excited to hear more about the project from the inside, and absolutely looking forward to Megan’s keynote speech!
The Hackaday Superconference is completely sold out, but that doesn’t mean you have to miss out. We’ll be live-streaming the keynote and all other talks on the Supercon main stage, so subscribe to our YouTube channel and you won’t miss a thing.
Don’t write off your weird ideas — turn them into reality. For years, woodworkers have used pen bodies as a canvas for showing off beautiful wood. But what’s the fun in that? [JPayneWoodworking] made a pen out of Ramen noodles just to see if he could.
The process is pretty straightforward, as he explains in the build video after the break. He hammered the uncooked noodle mass into pieces small enough to fit a pen blank mold, but not so small that they’re unrecognizable. Then he poured in pigmented epoxy in orange, silver, and black. [JPayneWoodworking] chose those colors for Halloween, but rather than looking freaky, we think it makes the pen look like a bowl of beef broth-y goodness from a fancy Ramen place.
After adding the flavor packet pigments, he put the mold in a pressure tank to remove all air pockets. Once the epoxy set up, the process was the same as for any other pen blank: take it for a spin on the lathe, polish it up, ream it out, and fit it with the parts from a pen kit. We’d like to see the look on the face of the next person to ask [JPayneWoodworking] for a pen.
Want to get into woodworking just to make weird stuff like this? We don’t blame you. But how does a hardware hacker such as yourself get started? [Dan Maloney]’s got you covered.