The Romo was a little iPhone-controlled robot brought to market with a Kickstarter campaign back in 2013. It originally used the iPhone’s audio jack as the control interface, but was quickly followed by an updated version that used the iPhone 4’s 30-pin connector and, later, the Lightning port. Romotive, the company behind Romo, eventually went out of business, but fortunately, they open-sourced the iOS app and the firmware. This has led to a few third-party apps currently on the App Store.
[David] wanted to use other hardware for control, so he set about reverse-engineering the protocol using the open-source software and a logic analyzer. Unsurprisingly, it uses a serial interface to send and receive commands, with two additional pins to detect the connection and wake up the Romo. After breaking out the interface header on the board, he was able to modify the Romo to mount a Raspberry Pi Zero and power it from the internal battery.
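[David] hasn’t published the protocol details yet, but the general shape he describes (a UART plus a detect pin and a wake pin) is easy to picture. Here’s a minimal Python sketch of what driving a Romo from a Pi Zero might look like; the pin numbers, serial settings, framing, and command bytes are placeholders of our own, not values from the open-sourced firmware.

```python
# Hypothetical sketch of talking to a Romo base from a Raspberry Pi Zero.
# Pin numbers, baud rate, framing, and command bytes are placeholders --
# the real values live in Romotive's open-sourced firmware, not here.
import time

import serial                                   # pyserial
from gpiozero import DigitalOutputDevice, DigitalInputDevice

WAKE_PIN = 17        # drives the Romo's wake line (assumed pin)
DETECT_PIN = 27      # reads the Romo's "device attached" line (assumed pin)

wake = DigitalOutputDevice(WAKE_PIN)
detect = DigitalInputDevice(DETECT_PIN)

# Pretend the base wakes on a short pulse of the wake line.
wake.on()
time.sleep(0.1)
wake.off()

if not detect.is_active:
    raise RuntimeError("Romo base not detected on the interface header")

# UART on the Pi Zero's header; settings are assumptions.
port = serial.Serial("/dev/serial0", baudrate=115200, timeout=0.5)

def send_command(cmd_id: int, payload: bytes = b"") -> None:
    """Frame and send one command; this framing is purely illustrative."""
    frame = bytes([cmd_id, len(payload)]) + payload
    port.write(frame)

# Example: spin both drive motors forward at half speed for a second.
send_command(0x10, bytes([128, 128]))   # placeholder "set motor speeds" command
time.sleep(1.0)
send_command(0x10, bytes([0, 0]))       # stop
```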
[David] has not made his code public yet, but it sounds like he plans to. It looks like Romos can be fun little experimentation platforms, and they can be found for cheap on eBay. We covered another cool Romo hack back in 2014, which used a projector and vision system to create a Mario Kart-like game. For a completely open-source smartphone robot, check out the OpenBot.
If you are like us, you’ve wondered about all the hoopla surrounding drones making home deliveries. Our battery-operated vehicles carry very little payload and still don’t have much range. Add sophisticated smarts and a couple of delivery packages, and you are going to need a lot more battery. Or maybe not. Amazon’s recent patent filing shows a different way to do it.
In the proposed scheme, a delivery truck drives to a neighborhood and then deploys a bunch of wheeled or walking drones to deliver in the immediate area. Not only does that reduce the range requirement, but there are other advantages, as well.
The rig is built around an earlier build from [Engineering After Hours], a skid-steer RC chassis that is nice and tough to handle rough-and-tumble driving. It’s paired with a trailer attached at the chassis’s center of rotation, which makes the pair highly maneuverable.
To launch rockets, an air tank on the trailer is hooked up to piping that feeds four Nerf rockets. Charged to just 40 psi, it launches the rounds with plenty of power for play purposes. Paired with an elevation control and a servo to trigger the firing valve, it’s a complete system that can shoot on the go.
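The build itself is driven by ordinary RC gear, but the firing logic is simple enough to sketch. Something like the following Python, running on any board that can generate servo PWM, would do the same job; the pins, angles, and dwell time are our assumptions, not values from [Engineering After Hours]’s build.

```python
# Illustrative sketch of a servo-triggered pneumatic Nerf launcher.
# Pins, angles, and timing are assumptions, not taken from the original build.
import time
from gpiozero import AngularServo

trigger = AngularServo(18, min_angle=0, max_angle=90)    # opens the firing valve
elevation = AngularServo(13, min_angle=0, max_angle=45)  # tilts the launch tubes

def fire(elev_deg: float, dwell_s: float = 0.15) -> None:
    """Aim, crack the valve open briefly, then close it again."""
    elevation.angle = elev_deg
    time.sleep(0.5)          # let the elevation servo settle
    trigger.angle = 90       # open the valve -- the 40 psi does the rest
    time.sleep(dwell_s)      # just long enough to dump air into one barrel
    trigger.angle = 0        # close the valve so the tank isn't emptied

fire(elev_deg=30)
```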
It’s a fun build that packs a punch, even if it doesn’t quite have the accuracy or range you might desire in an all-conquering Nerf combat platform. We’d love to see a similar build hooked up to some AI smarts to stalk targets independently of human control. Video after the break.
Farming has been undergoing quite a revolution in the past few years. Since World War II, most industrial farming has relied on synthetic fertilizer, large machinery, and huge farms growing single crops. Now a growing number of successful farmers are bucking that trend with small farms growing many crops and using natural methods of fertilizing that don’t require as much industry. Of course, even with these types of farms, some machinery is still nice to have, so this farmer has been developing an open-source automated farming robot.
The robot is known as Acorn and is the project of [taylor], who farms in California. The platform is powered by an 800-watt solar array feeding a set of supercapacitors for energy storage. It uses mountain bike wheels and tires fitted with electric hub motors, which give it four-wheel drive and four-wheel steering to keep it capable even in muddy fields. The farming tools, as well as any computer vision and automation hardware, can be housed under the solar panels. This prototype uses an Nvidia Jetson module to handle the heavy lifting of machine learning and automation, with a Raspberry Pi handling the basic operation of the robot, and it can navigate itself around a farm using highly precise GPS units.
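The project’s real control code lives in [taylor]’s repository, but the basic navigation idea is easy to sketch: steer toward the next GPS waypoint until you’re close enough, then move on. Here’s a simplified Python version of that loop; the gps() and drive() hooks, the gains, and the arrival radius are all our own assumptions, not Acorn’s actual implementation.

```python
# Simplified waypoint-following sketch -- not Acorn's actual control code.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in metres."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def follow(waypoints, gps, drive, arrive_radius_m=0.5, k_steer=1.0):
    """Steer toward each waypoint in turn; gps() and drive() are hypothetical hooks."""
    for wp_lat, wp_lon in waypoints:
        while True:
            lat, lon, heading = gps()                 # current RTK fix and heading
            if distance_m(lat, lon, wp_lat, wp_lon) < arrive_radius_m:
                break
            error = (bearing_deg(lat, lon, wp_lat, wp_lon) - heading + 180) % 360 - 180
            drive(speed=0.5, steer=k_steer * error)   # four-wheel steering command
    drive(speed=0.0, steer=0.0)
```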
While the robot’s development is currently ongoing, [taylor] hopes to develop a community that will build their own versions and help develop the platform. Farming improvements like this are certainly needed as more and more farmers shift from unsustainable monocultures to more ecologically friendly methods involving multiple simultaneous crops, carbon sequestration, and off-season cover crops. It’s certainly a long row to hoe but plenty of people are already plowing ahead.
Imagine you are at the movies and you see a Roomba-like robot climbing a wall or clinging to a ceiling. How would that work? If you are like us, you might think of suction cups or something mechanical or magnetic in the wall. Then again, it is a movie, so maybe it is just a camera trick. The robots from the Bioinspired Robotics and Design Lab at UCSD are no camera trick, though. As [Evan Ackerman] mentions in a post on IEEE Spectrum, “It’s either some obscure fluid effect or black magic.” You can watch a video about the bots below.
It turns out the answer is closer to a suction cup than you might think. According to the paper from the lab, a small flexible disk vibrates at 200 Hz. This generates a thin (less than 1 mm) layer of low-pressure air between the disk and the underlying surface. The suction from the disk lets the robot resist forces of up to 5 newtons.
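A quick back-of-the-envelope check shows how little pressure difference that actually takes. Treating the disk as a plain suction area, and assuming a disk radius of about 2 cm (our guess, not a figure from the paper):

\[
\Delta P = \frac{F}{\pi r^2} \approx \frac{5\ \mathrm{N}}{\pi\,(0.02\ \mathrm{m})^2} \approx 4\ \mathrm{kPa},
\]

or only around 4% of atmospheric pressure; a gentle vacuum, but spread over even a small disk it adds up to a useful holding force.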
That collective “Phew!” you heard this week was probably everyone on the Mars Ingenuity helicopter team letting out a sigh of relief while watching telemetry from the sixth and somewhat shaky flight of the UAV above Jezero crater. With Ingenuity now in an “operations demonstration” phase, the sixth flight was to stretch the limits of what the craft can do and learn how it can be used to scout out potential sites to explore for its robot buddy on the surface, Perseverance.
While the aircraft was performing its 150 m move to the southwest, the stream from the downward-looking navigation camera dropped a single frame. By itself, that wouldn’t have been so bad, but the glitch caused subsequent frames to come in with the wrong timestamps. This apparently confused the hell out of the flight controller, which commanded some pretty dramatic moves in the roll and pitch axes — up to 20° off normal. Thankfully, the flight controller was designed to handle just such an anomaly, and the aircraft was able to land safely within five meters of its planned touchdown. As pilots say, any landing you can walk away from is a good landing, so we’ll chalk this one up as a win for the Ingenuity team, who we’re sure are busily writing code to prevent this from happening again.
If wobbling UAVs on another planet aren’t enough cringe for you, how about a blind mechanical demi-ostrich drunk-walking up and down a flight of stairs? The work comes from Oregon State University and Agility Robotics, and the robot in question is called Cassie, an autonomous bipedal bot with a curious, bird-like gait. Without cameras or lidar for this test, the robot relied on proprioception, which senses the angles of its joints and the feedback from its motors when it touches a solid surface. And over ten tries up and down the stairs, Cassie did pretty well — she failed only twice, with just one counting as a face-plant, if indeed she had a face. We noticed that the robot often did that little move where you misjudge the step and land with the instep of your foot hanging over the tread; that one always has us grabbing for the handrail, but Cassie was able to power through it every time. The paper describing how Cassie was trained is pretty interesting — too bad ED-209’s designers couldn’t have read it.
So this is what it has come to: NVIDIA is now purposely crippling its flagship GPU cards to make them less attractive to cryptocurrency miners. The LHR, or “Lite Hash Rate”, cards include newly manufactured GeForce RTX 3080, 3070, and 3060 Ti cards, which will now have reduced Ethereum hash rates baked into the chip from the factory. When we first heard about this a few months ago, we puzzled a bit — why would a GPU card manufacturer care how its cards are used, especially if it’s selling a ton of them? But it makes sense that NVIDIA would like to protect its brand with its core demographic — gamers — and having miners snarf up all the cards and leave none for gamers is probably bad for business. So while it makes sense, we’ll have to wait and see how the semi-lobotomized cards are received by the market, and how the changes affect other non-standard uses for them, like weather modeling and genetic analysis.
Speaking of crypto, we found it interesting that police in the UK accidentally found a Bitcoin mine this week while searching for an illegal cannabis growing operation. It turns out that something that uses a lot of electricity, gives off a lot of heat, and has people going in and out of a small storage unit at all hours of the day and night is usually a cannabis farm, but in this case it turned out to be about 100 Antminer S9s set up on janky-looking shelves. The whole rig was confiscated and hauled away; while Bitcoin mining is not illegal in the UK, stealing the electricity to run the mine is, which the miners allegedly did.
And finally, we have no idea what useful purpose this information serves, but we do know that it’s vitally important to relate to our dear readers that yellow LEDs change color when immersed in liquid nitrogen. There’s obviously some deep principle of quantum mechanics at play here, and we’re sure someone will adequately explain it in the comments. But for now, it’s just a super interesting phenomenon that has us keen to buy some liquid nitrogen to try out. Or maybe dry ice — that’s a lot easier to source.
[Carl Bugeja] has been working on his PCB motors for more than three years now, and it doesn’t seem like he is close to running out of ideas for the project. His latest creation is a tiny Bluetooth-controlled robot built around two of these motors.
One of the main challenges of these axial flux PCB motors is their low torque output, so [Carl] had to make the robot as light as possible. The main board contains a microcontroller module with integrated Bluetooth, an IMU, a regulator, and two motor drivers. The motor stator boards are soldered to the main board using 90° header pins. The frame for the body and the rotors for the motors are 3D printed. Four neodymium magnets and a bearing are press-fit into each rotor. The motor shafts are off-the-shelf PCB pins with one end soldered to the stator board. Power comes from a small single-cell LiPo battery attached to the main board.
The robot moves, but with a jerky motion, and keeps making unintended turns. The primary cause of this seems to be the wobbly rotors, which cause the output torque to fluctuate throughout each rotation of the motor. Since there are only two points of contact with the ground, only the weight of the board and battery keeps the central part from rotating with the motors. This doesn’t look like it’s quite enough, so [Carl] wants to experiment with using the IMU to smooth out the motion. For the next version, he’s also working on a new shaft mount, a metal rotor, and a more efficient motor design.
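The IMU-based fix [Carl] is considering is, at heart, a feedback loop: measure how fast the body is yawing when it shouldn’t be, and trim the two motor drives to cancel it. A minimal Python-style sketch of that idea follows; the read_gyro_z() and set_motor() hooks, the sign convention, and the gain are made-up assumptions rather than anything from [Carl]’s firmware.

```python
# Sketch of gyro-based yaw correction for the two PCB motors.
# read_gyro_z() and set_motor() are hypothetical hooks, and the gain is a guess.
import time

K_P = 0.8           # proportional gain on unwanted yaw rate (assumed)
BASE_DRIVE = 0.6    # nominal duty cycle for straight-line driving

def drive_straight(read_gyro_z, set_motor, duration_s=5.0):
    """Trim left/right motor drive so the measured yaw rate stays near zero."""
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        yaw_rate = read_gyro_z()          # deg/s from the IMU, positive = turning left
        correction = K_P * yaw_rate
        left = min(max(BASE_DRIVE + correction, 0.0), 1.0)
        right = min(max(BASE_DRIVE - correction, 0.0), 1.0)
        set_motor("left", left)
        set_motor("right", right)
        time.sleep(0.01)                  # ~100 Hz control loop
```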
We look forward to seeing this in action, and also what other applications [Carl] can come up with. He has already experimented with turning it into a stepper motor, a linear motor, and a tiny jigsaw motor.