Despite how it might appear in bad action movies, throwing a knife and making it stick in a target is no easy feat. Taking inspiration from the aforementioned movies, [Quint] and his son built a magazine-fed knife throwing machine, capable of sticking a knife at any distance within its range.
Throwing a sharp piece of metal with a machine isn’t that hard, but timing the spin to hit the target point-first is a real challenge. To achieve this, [Quint] used a pair of high-performance servo motors to drive a pair of parallel timing belts. Mounting a carriage with a rotating knife-holder between the belts allows for a spinning throw by running one belt slightly faster. The carriage slides on a pair of copper rails, which also provide power to the knife holder via a couple of repurposed carbon motor brushes.
At first, the knife holder was an electromagnet, but it couldn’t reliably hold or release the stainless steel throwing knives. This was changed to a solenoid-driven mechanism that locks into slots machined into the knives. Knives are fed automatically from a spring-loaded magazine at the back as long as the trigger is held down, technically making it full-auto. To match the spin rate to the throwing distance, a LIDAR sensor measures the range to the target; the same reading also sets the angle of the aiming laser to compensate for the knife’s arcing trajectory.
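The build video doesn’t spell out the math, but the core relationship is simple: the flight time fixes how fast the knife must spin to arrive point-first. Here is a minimal sketch, assuming a flat, constant-speed throw (a simplification; the real machine compensates for drop with its aiming laser), with made-up numbers:

```python
def required_spin_rate(distance_m, launch_speed_ms, rotations=1.0):
    """Spin rate (rev/s) so the knife completes a whole number of
    rotations over the flight and arrives point-first.

    Assumes a flat, constant-speed trajectory -- an idealization of
    what [Quint]'s machine actually does.
    """
    flight_time = distance_m / launch_speed_ms  # seconds in the air
    return rotations / flight_time              # revolutions per second


# A 4 m throw at 8 m/s gives 0.5 s of flight time; one full
# rotation then needs a 2 rev/s spin.
print(required_spin_rate(4.0, 8.0))  # -> 2.0
```

Double the target distance at the same launch speed and the required spin rate halves, which is exactly why the machine has to re-time the belts for every range the LIDAR reports.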
The development process was fraught with frustration, failure, and danger. Unreliable knife holders, exploding carriages, and faulty electronics that seemingly fired of their own accord were all challenges that needed to be overcome. However, the result is a machine that can both throw knives and nurture a kid’s passion for building and programming.
The inconvenience of having to walk to your mailbox to check for mail has inspired many hackers to install automated systems that let them know when the mail has been delivered. Mailbox monitors have been made based on several different mechanisms: some measure the weight of the items inside, some use cameras and machine vision, while others simply trigger whenever the mailbox’s door or flap is moved. When [Gary Watts] wanted to install a notification system for his 1940s brick letterbox, his options were limited: with no flap or door to monitor and little room for mechanical contraptions, he decided to use a LIDAR sensor instead.
Probably best-known for their emerging application in self-driving cars, LIDAR systems send out a laser pulse and measure the time it takes for it to be reflected off a surface. In the case of [Gary]’s mailbox, that surface is either the brick wall or a letter leaning against it. Since letters are inserted through a vertical slot, they will usually be leaning upright against the wall, providing a clear target for the laser.
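That measurement boils down to one formula: the pulse covers the distance twice, so the range is half the round-trip time multiplied by the speed of light.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Distance to the reflecting surface from a time-of-flight
    measurement. The pulse travels out and back, hence the /2."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0


# A ~6.67 ns round trip corresponds to roughly one metre.
print(round(tof_distance(6.671e-9), 3))  # -> 1.0
```

The nanosecond-scale timing this implies is why ToF chips like the one [Gary] used integrate the whole measurement pipeline on-die rather than leaving it to the host microcontroller.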
The LIDAR module, a VL53L0X made by ST, is hooked up to a Wemos D1 Mini Pro. The D1 communicates with [Gary]’s home WiFi through an external antenna, and is powered by an 18650 lithium battery charged through a solar panel. The whole system is housed inside a waterproof plastic case, with the LIDAR sensor attached to the inside of the mailbox through a 3D-printed mounting bracket. On the software side, the mailbox notifier is powered by Home Assistant and MQTT. The D1 spends most of its time in deep-sleep mode, only waking up every 25 seconds to read out the sensor and send a notification if needed.
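The article doesn’t show [Gary]’s firmware, but the detection step amounts to a threshold against the empty-wall baseline. A sketch of that logic in Python, with assumed distances standing in for his real calibration:

```python
WALL_DISTANCE_MM = 300   # assumed baseline: sensor to the empty brick wall
MARGIN_MM = 50           # assumed noise margin around the baseline

def mail_present(reading_mm):
    """True when the reading is meaningfully shorter than the
    empty-mailbox baseline, i.e. a letter is leaning against the wall."""
    return reading_mm < WALL_DISTANCE_MM - MARGIN_MM


print(mail_present(120))  # letter close to the sensor -> True
print(mail_present(295))  # just the wall, within noise -> False
```

In the real system the equivalent check would run on the D1 during its 25-second wake-ups, with a positive result published over MQTT to Home Assistant.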
Why? Getting a drone that can fly a path and even return home when the battery is low, signal is lost, or on command, is simple enough. Just go to your favorite retailer, search “gps drone”, and you can get one for a shockingly low price. This is possible because GPS receivers have become cheap, small, light, and power efficient. While all of these inexpensive drones can fly a predetermined path, they usually do so by flying over any obstacles rather than around them.
[Nick Rehm] has envisioned a quadcopter that can do all of the things a GPS-enabled drone can do, without the use of a GPS receiver. [Nick] makes this possible by using algorithms similar to those used by Google Maps, with data coming from a typical IMU, a camera for computer vision, LIDAR for altitude, and an Intel RealSense camera for tracking position and movement. A Raspberry Pi 4 running the Robot Operating System (ROS) runs the autonomous show, and a Teensy takes care of flight control duties.
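The “Google Maps” comparison boils down to a shortest-path search such as A*. A minimal grid-based version is far simpler than a full ROS planner over RealSense data, but illustrates the same idea:

```python
import heapq

def astar(grid, start, goal):
    """Shortest path on a 2D occupancy grid (0 = free, 1 = obstacle),
    4-connected moves, Manhattan-distance heuristic.
    Returns a list of (row, col) cells or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # admissible heuristic: straight-line grid distance
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), start)]   # heap of (f-score, cell)
    came_from = {}
    g_cost = {start: 0}
    done = set()

    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:
            path = [cell]
            while cell in came_from:     # walk parents back to start
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if cell in done:
            continue
        done.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cell] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None


# A wall of obstacles forces the path around, not over.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (0, 2)))
```

Swap the hand-written grid for an occupancy map built from depth-camera data and you have the skeleton of the “around, not over” behavior that sets [Nick]’s drone apart from the cheap GPS variety.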
What we really enjoy about [Nick]’s video is his clear presentation of complex technologies, and a great sense of humor about a project that has consumed untold amounts of time, patience, and duct tape.
Sure, there are smart canes out there, commercial and otherwise. We’ve seen more than a few over the years. But a group of students at Stanford University have managed to bring something novel to the augmented cane.
Ground effect vehicles, or ekranoplans, have the advantage of being more efficient than normal aircraft and faster than boats, but so far haven’t been developed beyond experimental prototypes. Fortunately, this doesn’t stop companies from trying, which has led to a collaboration between [ThinkFlight] and [rctestflight] to create a small-scale demonstrator for the Flying Ship Company.
The Flying Ship Company wants to use unmanned electric ekranoplans as high-speed marine cargo carriers that can use existing maritime infrastructure for loading and unloading. For the scale model, [rctestflight] was responsible for the electronics and software, while [ThinkFlight] built the airframe. As with his previous ekranoplan build, [ThinkFlight] designed it in XFLR5, cut the parts from foam using a CNC hot wire cutter (which we still want a better look at), and laminated it with Kevlar for strength. One of the challenges of ground effect vehicles is that the center of pressure shifts rearward as they leave ground effect, causing them to pitch up. To maintain control when moving into and out of ground effect, these craft often use a large horizontal stabilizer mounted high on the tail, out of ground effect.
A major feature of this demonstrator is automatic altitude control using a LIDAR sensor mounted on the bottom. This was developed by [rctestflight] using a simple foam board ekranoplan and [ThinkFlight]’s previous airframe, with some custom code added to ArduPilot. It works very well on smooth, calm water, but waves introduce a lot of noise into the LIDAR data. It looks like they were able to overcome this challenge, and completed several successful test flights in calm and rough conditions.
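The video doesn’t detail how the wave noise was tamed, but a sliding median is a typical first defense against the short spikes that wave crests and troughs put into a rangefinder signal. An illustrative sketch, not ArduPilot’s actual filter:

```python
from statistics import median

def smooth_altitude(readings, window=5):
    """Sliding-window median over raw LIDAR heights (metres).
    A median rejects isolated spikes, unlike a moving average,
    which smears them into neighboring samples."""
    out = []
    for i in range(len(readings)):
        lo = max(0, i - window + 1)
        out.append(median(readings[lo:i + 1]))
    return out


# Spikes at 0.05 m (wave crest) and 0.62 m (trough) around a 0.30 m cruise:
raw = [0.30, 0.31, 0.05, 0.30, 0.62, 0.29, 0.30]
print(smooth_altitude(raw))
```

The trade-off is latency: a wider window rejects bigger disturbances but delays the controller’s view of genuine altitude changes, which matters when the craft is skimming centimetres above the water.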
The final product looks good, flies smoothly, and is easy to control since the pilot doesn’t need to worry about pitch or throttle control. It remains to be seen if the Flying Ship Company will overcome the challenges required to turn it into a successful commercial craft, and we will be following the project closely.
Ekranoplans are a curious class of vehicle; most well known for several Soviet craft designed to operate at sea, flying just above the waves in ground effect. [rctestflight] had accidentally come across the ground effect flight regime himself years ago, and decided it was time to build an ekranoplan of his own.
While ground-effect flight can be quite stable for a heavy, human-scale craft, the smaller RC version suffered more from minor perturbations from the wind and such. Thus, a Pixracer autopilot was installed, and combined with a small LIDAR device to accurately measure altitude above the ground. With some custom tweaks to the Ardupilot firmware, the craft was able to cleanly fly along barely a foot off the ground.
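ArduPilot’s real altitude controller is a cascaded PID loop, but the flavor of the correction the Pixracer computes from the LIDAR height can be sketched in a few lines. The gains here are made up for illustration:

```python
def altitude_correction(target_m, measured_m, climb_rate_ms,
                        kp=0.8, kd=0.3):
    """One step of a PD altitude controller: positive output means
    'command more lift'. The derivative term damps the response so
    the craft doesn't porpoise above and below the setpoint.
    Gains are illustrative, not ArduPilot's."""
    error = target_m - measured_m
    return kp * error - kd * climb_rate_ms


# 30 cm target, craft sagging to 22 cm and still descending:
print(altitude_correction(0.30, 0.22, -0.1))  # positive -> climb
```

Running this at the autopilot’s loop rate against the LIDAR altitude is what lets the craft hold “barely a foot off the ground” without constant pilot input.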
The final effect is almost mesmerizing; it appears as if the craft is hovering via some heretofore unknown technology rather than just flying in the usual sense. It’s still sensitive to breezes, and sudden drops in the terrain lead to a temporary escape from the ground effect region, but the effect is nonetheless impressive. It’s a nerve-wracking video at times, though, with quite a few near misses with traffic and children. Regardless of the nature of your experimental craft, be cognizant of your surroundings. We’ve seen [rctestflight]’s Ardupilot experiments before, too. Video after the break.
The Onion Tau LiDAR Camera is a small, time-of-flight (ToF) based depth-sensing camera that looks and works a little like a USB webcam, but with a really big difference: frames from the Tau include 160 x 60 “pixels” of depth information as well as greyscale. This data is easily accessed via a Python API, and example scripts make it easy to get up and running quickly. The goal is to be an affordable, easy-to-use option for projects that could benefit from depth sensing.
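To give a feel for what a 160 x 60 depth frame enables, here is the kind of processing a project might layer on top of the Python API. Note this operates on a synthetic frame rather than the Tau’s actual API calls, which I’ll cover separately:

```python
WIDTH, HEIGHT = 160, 60  # the Tau's depth resolution

def nearest_point(depth_frame):
    """Find the closest valid pixel in a row-major depth frame
    (metres, with 0.0 meaning 'no return'). Returns (distance, x, y)
    or None if nothing reflected."""
    best = None
    for y, row in enumerate(depth_frame):
        for x, d in enumerate(row):
            if d > 0 and (best is None or d < best[0]):
                best = (d, x, y)
    return best


# Synthetic frame: a flat 2 m wall with one object at 0.75 m.
frame = [[2.0] * WIDTH for _ in range(HEIGHT)]
frame[30][80] = 0.75
print(nearest_point(frame))  # -> (0.75, 80, 30)
```

Even this trivial nearest-obstacle scan is enough for a robot bump-avoidance behavior, which hints at why per-pixel depth is so much more useful than a single-point rangefinder.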
When the Tau was announced on Crowd Supply, I immediately placed a pre-order for about $180. Since then, the folks at Onion were kind enough to send me a pre-production unit, and I’ve been playing around with the device to get an idea of how it acts, and to build an idea of what kind of projects it would be a good fit for. Here is what I’ve learned so far.