Tiny Drones Navigate Like Real Bugs

When it comes to robotic navigation, the usual approach is to go as technically advanced and “smart” as possible. Yet the most successful lifeforms we know of take a completely different approach. Despite limited senses and cognitive abilities, invertebrates like ants and honeybees succeed by cooperating in large numbers. A joint team of researchers from TU Delft, the University of Liverpool, and Radboud University of Nijmegen decided to try this approach, and experimented with a simple navigation technique that allows a swarm of tiny flying robots to explore an unknown environment.

The drones used were off-the-shelf Crazyflie 2.0 micro quadcopters with add-on boards. Sensors consisted of the onboard IMU, simple range-finding sensors on a Multi-ranger deck for obstacle detection, and a downward-pointing optical flow sensor on a Flow deck to keep track of the distance travelled. To navigate, the drones used a “swarm gradient bug algorithm” (SGBA). Each drone has a different preferred direction of travel from takeoff. When an obstacle is encountered, the drone follows the contour of the obstacle, then continues in the preferred direction once the path is clear. When the battery drops to 60%, it returns to a wireless homing beacon. While this technique might not be the most efficient, it has the major advantage of being “lightweight” enough to implement on a cheap microcontroller, an STM32F4 in this case. The full research article is available for free, and is a treasure trove of information.
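To get a feel for just how lightweight the logic is, here’s a minimal, hypothetical state-machine sketch of the SGBA behavior described above. The real implementation is firmware on the STM32F4; the thresholds, names, and interfaces below are invented for illustration, not taken from the paper:

```python
import enum

class Mode(enum.Enum):
    SEEK = 1         # fly along this drone's preferred heading
    WALL_FOLLOW = 2  # trace the contour of an obstacle
    HOME = 3         # head back toward the wireless beacon

class SgbaSketch:
    """Toy version of the SGBA outline; not the paper's actual code."""

    def __init__(self, preferred_heading_deg):
        self.heading = preferred_heading_deg  # unique per drone at takeoff
        self.mode = Mode.SEEK

    def step(self, front_range_m, battery_pct, heading_clear):
        if battery_pct <= 60 and self.mode != Mode.HOME:
            self.mode = Mode.HOME             # low battery: home on the beacon
        elif self.mode == Mode.SEEK and front_range_m < 0.5:
            self.mode = Mode.WALL_FOLLOW      # obstacle ahead: follow its contour
        elif self.mode == Mode.WALL_FOLLOW and heading_clear:
            self.mode = Mode.SEEK             # path clear: resume preferred heading
        return self.mode
```

A handful of comparisons per control tick is exactly the kind of workload a cheap microcontroller handles with ease, which is the whole point of the approach.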

The main application researchers have in mind is search and rescue. A swarm of drones can explore an unstable or dangerous area and identify key areas to focus rescue efforts on. This can drastically reduce wasted time and risk to rescue workers. It is always cool to see complex problems being solved with simple solutions, and we are keen to see where things go. Check out the video after the break. Continue reading “Tiny Drones Navigate Like Real Bugs”

The Open Source Smart Home

[Tijmen Schep] sends in his project, Candle Smart Home, an exhibit of 12 smart home devices designed around the concepts of ownership, open source, and privacy.

The central controller runs on a Raspberry Pi running Mozilla’s new smart home operating system. Each individual device is Arduino-based, and when you click through on the site you get a well-designed graphic explaining how to build each device.
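For a flavor of the software side, here’s a minimal sketch of what a device exposed through Mozilla’s webthing-python library looks like. This is generic Web Things API code for illustration (not Candle’s actual firmware, which runs on Arduinos), and the lamp here is made up:

```python
from webthing import Property, SingleThing, Thing, Value, WebThingServer

# A hypothetical on/off lamp advertised to the gateway.
lamp = Thing('urn:dev:ops:candle-demo-lamp', 'Demo Lamp',
             ['OnOffSwitch'], 'An illustrative web-connected lamp')
lamp.add_property(Property(lamp, 'on', Value(False),
                           metadata={'@type': 'OnOffProperty',
                                     'title': 'On/Off',
                                     'type': 'boolean'}))

# Serve the thing so the Raspberry Pi gateway can discover it on the network.
server = WebThingServer(SingleThing(lamp), port=8888)
server.start()
```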

It’s also fun to see how many people worked together on this project and added their own flair. Whether it’s a unique covering for the devices or a toggle switch that can toggle itself, there are quite a few personal touches.

Like anyone who’s had the sneaking suspicion that Jeff Bezos is listening in on their conversations, we get the need for this. We also love how approachable it makes hacking your own hardware. What are your thoughts?

Autonomous Air Boat Vs Lake Washington

Autonomous vehicles make a regular appearance around here, as does [Daniel Riley] aka [rctestflight]. His fascination with building long-endurance autonomous vehicles continues, and this time he built an autonomous air boat.

This craft incorporates a lot of the lessons learnt from his earlier autonomous boat built around a plastic food container. One of the biggest issues was that the submerged propellers kept getting tangled in weeds. This led [Daniel] to move his props above the water, sacrificing some efficiency for reliability and turning it into an air boat. The boat itself is a catamaran design, with separate 3D printed hulls connected by carbon fibre tubes. As with the tupperware boat, autonomous control is handled by the open source ArduPilot software.
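Checking on a craft like this from shore comes down to listening to its MAVLink telemetry stream. As a rough sketch, here’s how one might watch position reports with the pymavlink library; the UDP connection string is an assumption and would depend on the actual radio link:

```python
from pymavlink import mavutil

# Assumed telemetry link: a ground radio forwarding MAVLink over UDP.
link = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
link.wait_heartbeat()
print('Heartbeat from system', link.target_system)

while True:
    # Stream the boat's position and speed as it works through the mission.
    msg = link.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
    print('lat %.6f lon %.6f  speed %.1f m/s' %
          (msg.lat / 1e7, msg.lon / 1e7,
           (msg.vx ** 2 + msg.vy ** 2) ** 0.5 / 100.0))
```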

During testing [Daniel] had another run-in with his old arch-nemesis, seaweed. It turns out the sharp vertical bow is a nice edge for weeds to hook onto, creating drag and screwing up the craft’s control. [Daniel]’s workaround involved moving the big batteries to the rear, causing the bows to lift almost completely out of the water.

With long endurance in mind right from the start of the project, [Daniel] put it to the test with a 13 km mission on Lake Washington very early one morning. For most of the mission the boat was completely on its own, with [Daniel] stopping at various points along the lake shore to check on its progress. Everything went smoothly until 10 km into the mission, when the telemetry showed the boat slowing down and angling off course, after which it started going in circles. Luckily for [Daniel], a lakeside resident offered him a kayak, and he managed to recover the half-sunken vessel. He suspects the cause of the failure was a slowly leaking hull. [Daniel] is already working on the next version, and we’re looking forward to seeing what he comes up with. Check out the video after the break. Continue reading “Autonomous Air Boat Vs Lake Washington”

An Open Hardware Laser Engraver For Everyone

Right now, you can get a diode laser engraver on eBay for around $100 USD. That sounds like a deal, but it’ll probably use some arcane proprietary software, won’t be terribly accurate, and the laser itself will almost certainly be fully exposed. Of course there’s no shortage of DIY builds which improve upon this situation greatly, but unfortunately the documentation and instructions to replicate them yourself often leave a lot to be desired.

To get a safe and accurate laser platform into the hands of hackers everywhere, we need more well documented open source designs that are actually built with the community in mind. Projects like the Engravinator from [Adam Haile]. This isn’t a one-off design with documentation thrown together after the fact; it’s a fully open hardware engraver with a concise assembly guide, built from 3D printed parts and readily available components. You’re free to source and print the parts yourself or, eventually, purchase everything as a kit.

Pen-equipped Engravinator

The microwave-sized Engravinator is built from standard 2020 aluminum extrusion and offers a working area of 130 mm x 130 mm. There’s a hatch on the front of the enclosure for objects small enough to fit inside the machine, but the open bottom and handles on the top also allow the user to place the Engravinator directly onto the work surface. [Adam] says this feature can be especially useful if you’re looking to burn a design into a tabletop or other large object.
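Driving a machine like this is standard G-code territory. As a rough illustration of staying within that 130 mm x 130 mm envelope, here’s a hypothetical Python snippet that emits G-code for a GRBL-style controller in laser mode; the controller choice, feed rate, and power values are our assumptions, not anything from the Engravinator documentation:

```python
WORK_AREA_MM = 130.0  # assumed usable travel in X and Y

def engrave_square(size_mm, feed_mm_min=1000, power=300):
    """Emit G-code tracing a centered square, clamped to the work area."""
    size = min(size_mm, WORK_AREA_MM)
    o = (WORK_AREA_MM - size) / 2.0
    lines = [
        'G21 G90',                   # millimeters, absolute positioning
        'G0 X%.2f Y%.2f' % (o, o),   # rapid to the start corner, laser off
        'M4 S%d' % power,            # dynamic laser power (GRBL laser mode)
    ]
    for x, y in [(o + size, o), (o + size, o + size), (o, o + size), (o, o)]:
        lines.append('G1 X%.2f Y%.2f F%d' % (x, y, feed_mm_min))
    lines.append('M5')               # laser off
    return '\n'.join(lines)

print(engrave_square(50))
```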

Outside of the aluminum extrusion and miscellaneous hardware that make up the frame, most of the other parts are 3D printed. Released under the CERN Open Hardware License v1.2 and distributed as both STL and STEP files, the printable parts for the Engravinator are ripe for modification should you be so inclined. The same goes for the DXF files for the enclosure panels, which will need to be cut out of orange acrylic with a CNC or (ironically) a laser.

Continue reading “An Open Hardware Laser Engraver For Everyone”

Giving Sight To The Blind With A Wave Of The Hand

[Jakob Kilian] is working on a glove that he hopes will let the blind “see” their surroundings.

One of the most fascinating examples of the human brain’s plasticity is its ability to map one sense to another. Some people, for example, report being able to see sound, giving them a supernatural ability to distinguish tones. This effect has also been observed in the visually impaired. There are experiments where grids of electrodes were placed on the tongue, or mechanical actuators on the lower back, and the signals from a camera were fed into these grids and translated into shocks or movement. The interesting effect is that the users quickly learned to distinguish objects from this low-resolution input. As they continued to use these devices, they actually reported seeing the objects as their visual centers took over interpreting the input.

Most of these projects are quite bulky, with the usual mess you’d expect from a university laboratory. [Jakob]’s project appears to be heading toward a much more user-friendly product. A grid of haptic actuators is placed on the back of the user’s hand along with a depth camera. Not only is it somewhat unobtrusive, but the back of the hand is very sensitive to touch, and the camera naturally points wherever the hand does, putting it in a prime position to look around the world.
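The core mapping is easy to picture: collapse each depth frame down to one intensity per vibration motor, with nearer obstacles buzzing harder. Here’s a hypothetical NumPy sketch of that idea; the grid size, depth range, and mapping are our assumptions, not [Jakob]’s actual code:

```python
import numpy as np

GRID = (3, 3)          # assumed 3x3 haptic array on the back of the hand
MAX_DEPTH_MM = 2000    # assumed: ignore anything beyond 2 m

def depth_to_haptics(depth_frame):
    """Downsample a depth image to one intensity per vibration motor."""
    h, w = depth_frame.shape
    gh, gw = GRID
    # Crop to a multiple of the grid, then split into one cell per motor.
    cells = depth_frame[:h - h % gh, :w - w % gw].reshape(
        gh, h // gh, gw, w // gw)
    # Treat zero (no reading) as far away; keep each cell's nearest point.
    nearest = np.where(cells == 0, MAX_DEPTH_MM, cells).min(axis=(1, 3))
    # Nearer obstacles vibrate harder: 0.0 (nothing) to 1.0 (very close).
    return 1.0 - np.clip(nearest / MAX_DEPTH_MM, 0.0, 1.0)
```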

[Jakob] admits that, as an interaction designer, his hardware hacking skills are still growing. To us, the polish and thought that went into this is already quite impressive, so it’s no wonder he’s one of the Hackaday Prize Finalists.

Blend Your Last Frogs. Google Turns A Blind Eye To Flash.

Google has announced that it will no longer index Flash files.

Journey with me to a time in a faraway internet; a time before we had monetized social media. A time when the page you shared with your friends was your page and not a page on someone’s network. Way back when Visual Basic was what Python is now and JavaScript was a hack mostly used for cool effects. A hero arose. Macromedia Flash opened the gates to the interactive web, and for a chunk of time it consumed more than a decent portion of humanity’s attention and artistic output.

Computer art was growing, but was it public? How many grandmothers would see a demo?

New ground was broken, and anyone who wanted to become an animator or a web designer could manage it in a few tutorials. Only a few years before Flash took off, people had started talking about computers as a source for art, in mostly theoretical terms. There were demoscenes, university studies, and professional communities, of course, but were they truly public? Suddenly Flash made computer art an everyday thing. How could computers not be used for art? In schools and offices all over the world, people of varying technical skill would get links to games, animations, and clever sites sent by their friends and colleagues.

Over 23 years, Flash built up this incredible creative legacy. Yet it’s not perfect by any means, and it’s been a constant headache for our friendly neighborhood super-conglomerates. Apple hates how it drains the battery on their mobile devices, and that it’s a little village outside of their walled garden. Microsoft sees it as an endless parade of security vulnerabilities. They all see it as a competitor’s product eating into their proprietary platforms. Continue reading “Blend Your Last Frogs. Google Turns A Blind Eye To Flash.”

Supercon Keynote: Dr. Megan Wachs On RISC-V

Here at Hackaday, open source runs deep in our veins — and that goes for hardware as well as software. After all, it’s great to run open-source software, but if it’s running on black-box hardware, the system is only half open. While software has benefited mightily from all of the advantages of community development, the hardware world has only recently been catching up. And so we’ve been following the RISC-V open-source CPU development with our full attention.

Dr. Wachs, making her own wedding ring.

Our keynote speaker for the 2019 Hackaday Superconference is Dr. Megan Wachs, the VP of Engineering at SiFive, the company founded by the creators of the RISC-V instruction-set architecture (ISA). She has also chaired the RISC-V Foundation Debug Task Group, so it’s safe to say that she knows RISC-V inside and out. If there’s one talk we’d like to hear on the past, present, and future of the architecture, this is it.

RISC-V isn’t a particular chip; rather, it’s a design for how a CPU works, and a standard for the lowest-level language that the machine speaks. In contrast to proprietary CPUs, RISC-V CPUs from disparate vendors can all use the same software tools, unifying and opening their development. Moreover, open hardware implementations of the silicon itself mean that new players can enter the space more easily and bring their unique ideas to life faster, and we’ll all benefit. We can all work together.
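To make “a standard for the lowest-level language” concrete: the ISA specification pins down exactly how instruction bits are laid out, so anyone can write tooling against the public document. Here’s a toy Python sketch of our own (not any official tool) that unpacks an RV32I I-type instruction such as ADDI:

```python
def decode_itype(word):
    """Decode a 32-bit RV32I I-type instruction per the public spec."""
    opcode = word & 0x7F           # bits [6:0]
    rd     = (word >> 7) & 0x1F    # destination register, bits [11:7]
    funct3 = (word >> 12) & 0x07   # sub-operation, bits [14:12]
    rs1    = (word >> 15) & 0x1F   # source register, bits [19:15]
    imm    = word >> 20            # 12-bit immediate, bits [31:20]
    if imm & 0x800:                # sign-extend the immediate
        imm -= 0x1000
    return opcode, rd, funct3, rs1, imm

# 0x02A00093 encodes `addi x1, x0, 42`
print(decode_itype(0x02A00093))    # (0x13, 1, 0, 0, 42)
```

Because the encoding is an open standard, the same decode logic works for silicon from any vendor, which is precisely what makes shared toolchains possible.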

It’s no coincidence that this year’s Supercon badge has two RISC-V cores running in its FPGA fabric. When we went shopping around for an open CPU core design, we had a few complete RISC-V systems to pick from, full compiler and development toolchains to write code for them, and of course, implementations in Verilog ready to flash into the FPGA. The rich, open ecosystem around RISC-V made it a no-brainer for us, just as it does for companies making neural-network peripherals or even commodity microcontrollers. You’ll be seeing a lot more RISC-V systems in the near future, on your workbench and in your pocket.

We’re tremendously excited to hear more about the project from the inside, and absolutely looking forward to Megan’s keynote speech!

The Hackaday Superconference is completely sold out, but that doesn’t mean that you have to miss out. We’ll be live-streaming the keynote and all other talks on the Supercon main stage, so subscribe to our YouTube channel and you won’t miss a thing.