Farming is a challenge under even the best of circumstances. Almost all conventional farmers use some combination of tillers, combines, seeders and plows to help get the difficult job done, but for those like [Taylor] who do not farm large industrial monocultures, more specialized tools are needed. While we’ve featured the Acorn open source farming robot before, it’s back now with new and improved features and a simulation mode to help rapidly improve the platform’s software.
The first of the two new physical features is a fail-safe braking system. Since the robot uses electric geared hub motors for propulsion, the braking system consists of two normally closed relays which short the motor leads in emergency situations. This presents the motors with an extremely high load and stops them from turning. The robot has also been given advanced navigation facilities so that it can follow custom complex routes. And finally, [Taylor] created a simulation mode so that the robot’s entire software stack can be run in Docker and tested inside a simulation without using the actual robot.
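The appeal of the normally-closed arrangement is that the safe state requires no power at all: if the coils ever lose current, the contacts snap shut and the motor windings are shorted. A minimal sketch of that logic (class and method names are our own, not from the Acorn codebase):

```python
# Hypothetical model of a fail-safe dynamic brake: the relays shorting the
# motor leads are normally closed, so the brake engages whenever the relay
# coil is NOT energized -- loss of power defaults to "stopped".

class FailSafeBrake:
    def __init__(self):
        self.coil_energized = False  # power-on default: brake engaged

    def release(self):
        """Energize the relay coils, opening the contacts so the motor can spin."""
        self.coil_energized = True

    def emergency_stop(self):
        """De-energize the coils; the NC contacts short the motor leads."""
        self.coil_energized = False

    @property
    def braking(self):
        # Shorted windings present a very heavy load to a spinning motor,
        # bringing it to a stop (dynamic braking).
        return not self.coil_energized


brake = FailSafeBrake()
assert brake.braking          # safe state at power-up
brake.release()
assert not brake.braking      # driving normally
brake.emergency_stop()
assert brake.braking          # leads shorted, motor stops
```

Any fault path, software watchdog, or plain battery failure ends up in the same de-energized, braking state, which is the whole point of the design.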
For farmers who are looking to buck unsustainable modern agricultural practices while maintaining profitable farms, a platform like Acorn could be invaluable. With the ability to survey, seed, harvest, and even weed, it could perform every task of larger agricultural machinery. Of course, if you want to learn more about it, you can check out our earlier feature on this futuristic farming machine.
It’s easy to take for granted the constantly-connected, GPS-equipped, navigation device most of us now carry in our pockets. Want to know how to get to that new restaurant you heard about? A few quick taps in Google Maps, and the optimal route given your chosen transportation method will be calculated in seconds. But if you ever find yourself lost in the woods, you might be in for a rude awakening. With no cell signal and a rapidly dwindling battery, that fancy smartphone can quickly end up being about as useful as a rock.
Enter the IndiaNavi, a modernization of the classic paper map that’s specifically designed to avoid the pitfalls that keep your garden-variety smartphone from being a reliable bushcraft tool. The color electronic paper display not only keeps the energy consumption low, but also offers unbeatable daylight readability. No signal? No problem, as the relevant maps are pre-loaded on the device.
Besides the 5.65 inch e-paper display from Waveshare, the IndiaNavi features a L96 M33 GPS receiver and ESP32-WROOM-32 microcontroller. The 3D printed enclosure that holds the electronics and the lithium pouch battery that powers them is still in the early stages, but we like the book-style design. The focus on simplicity and reliability doesn’t end with the hardware, either. The software is about as straightforward as it gets: just boot the IndiaNavi and you’re presented with a map that shows your current position.
Over the years, we’ve seen plenty of projects that use ultrasonic or time-of-flight sensors as object detection methods for the visually impaired. Ultrasonic sensors detect objects like sonar — they send sound pulses and measure the time it takes for the signal to bounce off the object and come back. Time-of-flight sensors do essentially the same thing, but with infrared light. In either case, the notifications often come as haptic feedback on the wrist or head or whatever limb the ultrasonic module is attached to. We often wonder why there aren’t commercially-made shoes that do this, but it turns out there are, and they’re about to get even better.
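The math behind the ultrasonic approach is pleasantly simple: the sensor times the echo's round trip, and the one-way distance is half that trip at the speed of sound. A quick sketch (the function name is ours):

```python
# Time-of-flight math an ultrasonic rangefinder relies on: the ping travels
# to the obstacle and back, so the one-way distance is half the round trip
# at the speed of sound (~343 m/s in air at roughly room temperature).

SPEED_OF_SOUND = 343.0  # m/s

def echo_to_distance(echo_time_s: float) -> float:
    """Convert a round-trip echo time in seconds to distance in metres."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A ~23.3 ms round trip works out to roughly 4 m.
print(round(echo_to_distance(0.0233), 2))
```

Time-of-flight light sensors use the same relationship, just with the speed of light and correspondingly tighter timing requirements.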
Today, Tec-Innovation makes shoes with ultrasonic sensors on the toes that can detect objects up to four meters away. The wearer is notified of obstacles through haptic feedback in the shoes as well as an audible phone notification via Bluetooth. The company teamed up with the Graz University of Technology in Austria to give the shoes robot vision that provides even better detail.
Ultrasonic sensing is a great help, but it can’t detect the topography of an obstacle and tell a pothole from a rock from a wall. But if you have a camera on each foot, you can use the image data to determine obstacle types and notify the user accordingly. These new models will still have the ultrasonic sensors to do the initial object detection, and use the cameras for analysis.
Whenever they do come out, the sensors will all be connected through the app, which paves the way for crowdsourced obstacle maps of various cities. The shoes will also be quite expensive. Can you do the same thing for less? Consider the gauntlet thrown!
[SamsonMarch] designs electronic products by day and — apparently — does it in his spare time, too. His latest is a pair of really cool shades that give him turn-by-turn directions as he walks around town. Unlike some smart glasses, these get around the difficult problem of building a heads-up display by using a very simple interface based on colored LEDs visible to your peripheral vision in the temples of the frames.
The glasses themselves look great; designed in Fusion 360 and cut out of wood, no one would give them a second glance. [Sam] says you could 3D print them, too, but we think the wood looks best even if the stock is a cheap bamboo cutting board. He also cut the lenses out of acrylic.
The slots in the temples are where the action is, though. An iPhone app takes input and talks to Apple services to get directions. A lot of thought went into making the app work even though the phone keeps trying to put it to sleep. Each PCB hosts an RGB LED for indicating left/right turn and destination. They talk to the app using BLE and include accelerometers which put the boards — powered by coin cells — into sleep mode when no movement is detected.
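The motion-gated sleep trick is worth a closer look, since it's what makes coin cells viable here. One common way to do it is to watch a short window of accelerometer magnitudes and power down when they stop changing; the sketch below is our own illustration of that idea, not [Sam]'s firmware, and the window size and threshold are made-up values:

```python
# Hypothetical sketch of motion-gated sleep: keep a short window of
# accelerometer magnitudes and report "time to sleep" when they stay
# within a small band (the board at rest reads a steady ~1 g).

from collections import deque

class MotionGate:
    def __init__(self, window=50, threshold_g=0.05):
        self.samples = deque(maxlen=window)  # recent |a| readings, in g
        self.threshold_g = threshold_g       # allowed spread while "still"

    def update(self, magnitude_g: float) -> bool:
        """Feed one accelerometer magnitude; return True when it's time to sleep."""
        self.samples.append(magnitude_g)
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough history yet
        return max(self.samples) - min(self.samples) < self.threshold_g


gate = MotionGate(window=5)
still = [gate.update(1.0 + 0.001 * i) for i in range(5)]
print(still[-1])  # a run of near-identical readings ends with True
```

Many accelerometer ICs can do this entirely in hardware via a motion-interrupt pin, which lets the microcontroller itself sleep between wake events.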
[James Bruton] is an impressive roboticist, building all kinds of robots from tracked, exploring robots to Boston Dynamics-esque legged robots. However, many of the robots are proof-of-concept builds that explore machine learning, computer vision, or unique movements and characteristics. This latest build makes use of everything he’s learned from building those but strives to be useful on a day-to-day basis as well, and is part of the beginning of a series he is doing on building a Really Useful Robot. (Video, embedded below.)
While the robot isn’t quite finished yet, his first video in this series explores the idea behind the build and the construction of the base of the robot itself. He wants this robot to be able to navigate its environment but also carry out instructions such as retrieving a small object from a table. For that it needs a heavy base which is built from large 3D-printed panels with two brushless motors with encoders for driving the custom wheels, along with a suspension built from casters and a special hinge. Also included in the base is an Nvidia Jetson for running the robot, and also handling some heavy lifting tasks such as image recognition.
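A base like this, with two encoder-equipped drive wheels, typically tracks its pose by dead reckoning: integrating encoder ticks into an (x, y, heading) estimate between sensor corrections. A sketch of the standard differential-drive math (the wheel radius, track width, and tick count here are placeholder values, not [James]'s actual numbers):

```python
# Standard differential-drive odometry: turn wheel-encoder ticks into an
# (x, y, heading) pose estimate. All physical constants are assumed
# placeholder values for illustration.

import math

TICKS_PER_REV = 4096     # encoder resolution (assumed)
WHEEL_RADIUS  = 0.10     # metres (assumed)
TRACK_WIDTH   = 0.40     # distance between the wheels, metres (assumed)

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Advance the pose estimate by one batch of encoder ticks."""
    d_left  = 2 * math.pi * WHEEL_RADIUS * left_ticks  / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_center = (d_left + d_right) / 2            # forward travel
    d_theta  = (d_right - d_left) / TRACK_WIDTH  # heading change, radians
    # Integrate along the arc's mid-heading for better accuracy.
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

# Equal tick counts on both wheels: the robot drives straight ahead.
x, y, theta = odometry_step(0.0, 0.0, 0.0, 4096, 4096)
print(round(x, 3), round(y, 3), round(theta, 3))
```

Encoder drift accumulates quickly, which is why builds like this pair odometry with vision or mapping running on something like the Jetson.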
As of this writing, [James] has also released his second video in the series which goes into detail about the mapping and navigation functions of the robot, and we’re excited to see the finished product. Of course, if you want to see some of [James]’s other projects be sure to check out his tracked rover or his investigations into legged robots.
You’ve built a brand new project, and it’s a wonderful little thing that’s out and about in the world. The only problem is, you need to know its location to a decent degree of accuracy. Thankfully, GPS is a thing! With an off-the-shelf module, it’s possible to get all the location data you could possibly need. But how do you go about it, and what parts are the right ones for your application? For the answers to these questions, read on! Continue reading “How To Choose The Right GPS Module For Your Project”→
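As a taste of what "all the location data you could possibly need" looks like in practice: most hobbyist modules stream NMEA 0183 sentences over serial, and pulling a fix out of a GGA sentence takes only a few lines. This minimal sketch skips the checksum verification and empty-field handling real code should do:

```python
# Minimal parse of an NMEA 0183 $GPGGA sentence into decimal degrees.
# Latitude is ddmm.mmmm and longitude is dddmm.mmmm, so split off the
# degrees and divide the remaining minutes by 60. Real code should also
# verify the trailing *XX checksum and handle empty fields (no fix yet).

def parse_gga(sentence: str):
    """Return (lat, lon) in decimal degrees from a GGA sentence."""
    fields = sentence.split(",")
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

# The classic textbook example sentence (a fix near 48.12 N, 11.52 E):
lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(round(lat, 4), round(lon, 4))
```

Libraries like pynmea2 handle the edge cases for you, but it's worth knowing how little magic is involved.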
Keeping track of position is crucial in a lot of situations. On Earth, it’s usually relatively straightforward, with systems having been developed over the centuries that allow one to get at least a rough fix on one’s position on this planet. For a satellite out in space, however, it’s harder. How does it keep its communications dish pointed towards Earth?
The stars are an obvious orientation point. The Attitude and Articulation Control Subsystem (AACS) on the Voyager 1 and 2 space probes has the unenviable task of keeping the spacecraft’s communication dish aligned precisely with a communications dish back on Earth, which from deep space is an incomprehensibly tiny target.
Back on Earth, the star tracker concept has become quite popular among photographers who try to image the night skies. Even in your living room, VR systems also rely on knowing the position of the user’s body and any peripherals in space. In this article we’ll take a look at the history and current applications of this type of position tracking. Continue reading “Star Trackers: Telling Up From Down In Any Space”→