While we wish colleges and universities competed more on academics, we can’t deny that more people are interested in their athletics programs. Oregon State, however, has done a little of both since its robot Cassie became the world’s fastest bipedal robot according to Guinness World Records. You can see a video of the 100 meter run below, but don’t blink; the robot turned in a time of around 25 seconds.
Impressive, but still not on par with Usain Bolt’s time of under 10 seconds for the same distance. If you want to see what that would be like, try running the long way across a football field and see how far you get in 25 seconds. There isn’t a lot of technical detail about the robot, but you can intuit some things from watching it go. You can also find a little more information on the robot and some of its siblings on the University’s website.
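For a rough sense of that gap, here’s a quick back-of-the-envelope calculation using the ~25 second time above and Bolt’s 9.58 second world record:

```python
# Back-of-the-envelope speed comparison. Cassie's time is approximate,
# per the article; Bolt's 9.58 s is his 2009 Berlin world record.
cassie_time = 25.0   # seconds over 100 m (approximate)
bolt_time = 9.58     # seconds over 100 m

distance = 100.0  # meters
cassie_speed = distance / cassie_time   # ~4.0 m/s
bolt_speed = distance / bolt_time       # ~10.4 m/s

# How far down the track Cassie would be when Bolt crosses the line:
cassie_at_finish = cassie_speed * bolt_time  # ~38 m

print(f"Cassie: {cassie_speed:.1f} m/s, Bolt: {bolt_speed:.1f} m/s")
print(f"When Bolt finishes, Cassie has covered about {cassie_at_finish:.0f} m")
```

In other words, Bolt would lap the robot before it even reached the 40 meter mark.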
If you think robots won’t ever run as well as humans, we used to think the same thing about playing chess. This next robot isn’t how we normally envision a bipedal machine. Then again, there isn’t any reason robots have to look, or move, like we do.
LEONARDO, a somewhat tortured name derived from “LEgs ONboARD drOne,” is actually just what it appears to be: a quadcopter with a set of legs. It comes to us from Caltech’s Center for Autonomous Systems and Technologies, and the video below makes it easy to see what kind of advantages a kinematic mash-up like this would offer. LEO combines walking and flying to achieve a completely alien-looking form of locomotion: a bouncy, tip-toeing step that really looks like someone just learning how to walk in high heels. The upper drone aspect of LEO provides a lot of the stabilization needed for walking; the thrust from the rotors is where that bouncy compliance comes from. But the rotors can also instantly ramp up the thrust so LEO can fly over obstacles, like stairs. It’s pretty good at slacklining and skateboarding, too.
It’s easy to see how LEO’s multimodal locomotion system solves — or more accurately, avoids — a number of the problems real-world bipedal robots are going to experience. For now, LEO is pretty small — only about 30″ (76 cm) tall. And it’s rather lightly constructed, as one would expect for something that needs to fly occasionally. But it’s easy to see how something like this could be scaled up, at least to a point. And LEO’s stabilization system might be just what its drunk-walking cousin needs.
Ever since humanity has grasped the idea of a robot, we’ve wanted to imagine them into walking humanoid form. But making a robot walk like a human is not an easy task, and even the best of them end up with the somewhat shuffling gait of a Honda Asimo rather than the graceful poise of a ballerina. Only in recent years have walking robots appeared to come of age, and then not by mimicking the human gait but something more akin to a bird’s.
We’ve seen it in the Boston Dynamics models, and now also in a self-balancing two-legged robot developed at Oregon State University, which demonstrated its abilities by completing an unaided 5 km run after using machine learning to teach itself to run from scratch. It’s believed to be the first time a robot has achieved such a feat without first being programmed for the specific task.
The university’s PR piece envisages a time in which walking robots of this type have become commonplace, and when humans interact with them on a daily basis. We can certainly see that they could perform a huge number of autonomous outdoor tasks that perhaps a wheeled robot might find to be difficult, so maybe they have a bright future. Decide for yourself, after watching the video below the break.
That collective “Phew!” you heard this week was probably everyone on the Mars Ingenuity helicopter team letting out a sigh of relief while watching telemetry from the sixth and somewhat shaky flight of the UAV above Jezero crater. With Ingenuity now in an “operations demonstration” phase, the sixth flight was to stretch the limits of what the craft can do and learn how it can be used to scout out potential sites to explore for its robot buddy on the surface, Perseverance.
While the aircraft was performing its 150 m move to the southwest, the stream from the downward-looking navigation camera dropped a single frame. By itself, that wouldn’t have been so bad, but the glitch caused subsequent frames to come in with the wrong timestamps. This apparently confused the hell out of the flight controller, which commanded some pretty dramatic moves in the roll and pitch axes — up to 20° off normal. Thankfully, the flight controller was designed to handle just such an anomaly, and the aircraft was able to land safely within five meters of its planned touchdown. As pilots say, any landing you can walk away from is a good landing, so we’ll chalk this one up as a win for the Ingenuity team, who we’re sure are busily writing code to prevent this from happening again.
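Why would one dropped frame cause such drama? A sketch makes it intuitive. This is emphatically not Ingenuity’s flight software, just an illustration of why a frame whose timestamp no longer matches its image corrupts a naive visual-odometry velocity estimate: the displacement gets divided by the wrong time interval, and the controller then “corrects” for motion that never happened.

```python
# Illustrative sketch only, not Ingenuity's actual navigation code.
def velocity_estimates(positions, timestamps):
    """Naive per-frame velocity: displacement divided by elapsed time."""
    return [
        (p1 - p0) / (t1 - t0)
        for (p0, t0), (p1, t1) in zip(
            zip(positions, timestamps), zip(positions[1:], timestamps[1:])
        )
    ]

# Craft moving steadily: 1 m per frame, frames nominally 0.1 s apart.
positions = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
good_ts   = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]

# Now the frame at t=0.2 is dropped, but later frames inherit the stale
# (earlier) timestamps, so intervals no longer match the images.
positions_glitched = [0.0, 1.0, 3.0, 4.0, 5.0]  # image for 2.0 m is lost
ts_glitched        = [0.0, 0.1, 0.2, 0.3, 0.4]  # shifted timestamps

print(velocity_estimates(positions, good_ts))              # steady ~10 m/s
print(velocity_estimates(positions_glitched, ts_glitched)) # spurious ~20 m/s spike
```

A flight controller trusting that spurious spike would command an aggressive attitude change, which is roughly the kind of behavior the telemetry showed.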
If wobbling UAVs on another planet aren’t enough cringe for you, how about a blind mechanical demi-ostrich drunk-walking up and down a flight of stairs? The work comes from Oregon State University and Agility Robotics, and the robot in question is called Cassie, an autonomous bipedal bot with a curious, bird-like gait. Without cameras or lidar for this test, the robot relied on proprioception, which senses the angle of each joint and the feedback from the motors when the robot touches a solid surface. And over ten tries up and down the stairs, Cassie did pretty well: she failed only twice, and only one of those counted as a face-plant, if indeed she had a face. We noticed that the robot often did that little move where you misjudge the step and land with the instep of your foot hanging over the tread; that one always has us grabbing for the handrail, but Cassie was able to power through it every time. The paper describing how Cassie was trained is pretty interesting; too bad ED-209’s designers couldn’t have read it.
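The core proprioceptive trick, sensing contact through the motors rather than through vision, can be sketched in a few lines. This is a hedged illustration, not Cassie’s actual controller: the idea is that when a foot hits a surface, the motor must exert extra torque to keep tracking its commanded joint angle, so a sustained torque error above some threshold signals ground contact. The threshold and debounce window here are made up for the example.

```python
# Hypothetical values for illustration, not from the Cassie paper.
CONTACT_TORQUE_NM = 5.0   # torque-error threshold suggesting contact
WINDOW = 3                # consecutive samples required (debounce)

def detect_contact(torque_errors, threshold=CONTACT_TORQUE_NM, window=WINDOW):
    """Return the sample index where contact is first confirmed, or None."""
    streak = 0
    for i, err in enumerate(torque_errors):
        streak = streak + 1 if abs(err) > threshold else 0
        if streak >= window:
            return i - window + 1  # first sample of the confirmed streak
    return None

# Swing phase shows small tracking errors; at touchdown the error jumps.
samples = [0.4, 0.6, 0.5, 0.7, 6.2, 7.8, 8.1, 7.5]
print(detect_contact(samples))  # contact confirmed starting at index 4
```

The debounce window matters: a single noisy torque spike shouldn’t be mistaken for a stair tread.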
So this is what it has come to: NVIDIA is now purposely crippling its flagship GPU cards to make them less attractive to cryptocurrency miners. The LHR, or “Lite Hash Rate,” cards include newly manufactured GeForce RTX 3080, 3070, and 3060 Ti cards, which will now have reduced Ethereum hash rates baked into the chip at the factory. When we first heard about this a few months ago, we puzzled a bit: why would a GPU manufacturer care how its cards are used, especially if it’s selling a ton of them? But it makes sense that NVIDIA would like to protect its brand with its core demographic, gamers, and having miners snarf up all the cards while leaving none for gamers is probably bad practice. So while it makes sense, we’ll have to wait and see how the semi-lobotomized cards are received by the market, and how the changes impact other non-standard uses, like weather modeling and genetic analysis.
Speaking of crypto, we found it interesting that police in the UK accidentally found a Bitcoin mine this week while searching for an illegal cannabis growing operation. It turns out that something that uses a lot of electricity, gives off a lot of heat, and has people going in and out of a small storage unit at all hours of the day and night usually is a cannabis farm, but in this case it turned out to be about 100 Antminer S9s set up on janky-looking shelves. The whole rig was confiscated and hauled away; while Bitcoin mining is not illegal in the UK, stealing the electricity to run the mine is, which the miners allegedly did.
And finally, we have no idea what useful purpose this information serves, but we do know that it’s vitally important to relate to our dear readers that yellow LEDs change color when immersed in liquid nitrogen. There’s obviously some deep principle of quantum mechanics at play here, and we’re sure someone will adequately explain it in the comments. But for now, it’s just a super interesting phenomenon that has us keen to buy some liquid nitrogen to try out. Or maybe dry ice — that’s a lot easier to source.
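To hazard an explanation for the comments section: a semiconductor’s band gap widens as temperature drops, an effect empirically captured by the Varshni relation, so at liquid nitrogen temperatures the emitted photons carry more energy and the light shifts toward shorter wavelengths. Here’s a hedged sketch of the effect; the Varshni coefficients below are illustrative placeholders, not data for any specific LED chemistry.

```python
HC_EV_NM = 1239.84  # h*c in eV*nm, for converting energy <-> wavelength

def varshni_gap(eg0_ev, alpha, beta, temp_k):
    """Varshni relation: Eg(T) = Eg(0) - alpha*T^2 / (T + beta)."""
    return eg0_ev - alpha * temp_k**2 / (temp_k + beta)

# Hypothetical yellow emitter: pick Eg(0) so it emits ~590 nm at 300 K.
alpha, beta = 5.0e-4, 600.0  # made-up Varshni coefficients, for illustration
eg0 = HC_EV_NM / 590 + alpha * 300**2 / (300 + beta)

for temp in (300, 77):  # room temperature vs. liquid nitrogen
    gap = varshni_gap(eg0, alpha, beta, temp)
    print(f"{temp:3d} K: Eg = {gap:.3f} eV -> {HC_EV_NM / gap:.0f} nm")
```

Even with these toy numbers, cooling the junction widens the gap and blue-shifts the emission, which is the direction of the color change seen in the video.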
Some days, we might be forgiven for believing Boston Dynamics has cornered the market on walking robots. They (and other players) are making incredible progress in their field, but three years ago Disney, trying to create autonomous, free-walking robotic actors for some of their more diminutive film characters, found none of the existing platforms were appropriate. So they set their Imagineering department to work on “Project Kiwi”, and we are now seeing the fruits of those efforts.
A large body of research on bipedal robots has amassed over the years, and as the saying goes, if these Imagineers saw further it was by standing on the shoulders of larger robotic platforms. However, the Project Kiwi designers made a laundry list of innovations in the process of miniaturization, from the “marrow conduit” cooling system, which forces air through hollow bones, to gearing that allows actuators to share motors even across joints. The electronics are distributed around the skeleton on individual PCBs with ribbon flex cables to reduce wiring, and almost every component is custom fabricated to meet the demanding size and weight requirements.
Even in this early prototype, Disney’s roots in life-like animatronics are evident. Groot’s movements are emotive, if a bit careful, and software can express a variety of personalities through his gaits and postures. The eyes and face are as expressive as we’ve come to expect (though to a keen eye, the visible seams give off some definite Westworld vibes). Reportedly, this version can handle gentle shoves and contact, but we do spot a safety cable still attached to the head. So there’s probably some way to go before we’ll see this interacting with the general public in a park.
It should come as no surprise that we here at Hackaday are big boosters of autonomous systems like self-driving vehicles. That’s not to say we’re without a healthy degree of skepticism, and indeed, the whole point of the “Automate the Freight” series is that economic forces will create powerful incentives for companies to build out automated delivery systems before they can afford to capitalize on demand for self-driving passenger vehicles. There’s a path to the glorious day when you can (safely) nap on the way to work, but that path will be paved by shipping and logistics companies with far deeper pockets than the average commuter.
So it was with some interest that we saw a flurry of announcements in the popular press recently regarding automated deliveries. Each by itself wouldn’t be worthy of much attention; companies are always maneuvering to be seen as ahead of the curve on coming trends, and often show off glitzy, over-produced videos and well-crafted press releases as a low-effort way to position themselves as well as to test markets. But seeing three announcements at one time was unusual, and may point to a general feeling by manufacturers that automated deliveries are just around the corner. Plus, each story highlighted advancements in areas specifically covered by “Automate the Freight” articles, so it seemed like a perfect time to review them and perhaps toot our own horn a bit.
Fans of technology will recall a number of years when Honda’s humanoid robot Asimo seemed to be everywhere. In addition to its day job in a research lab, Asimo had a public relations side gig showing everyone that Honda is about more than cars and motorcycles. From trade shows to television programs, even amusement parks and concert halls, Asimo worked a busy publicity schedule. Now a retirement party may be in order, since the research project has reportedly been halted.
Asimo’s activity has tapered off in recent years, so this is not a huge surprise. Honda’s official Asimo site hasn’t been updated in over a year. Recent humanoid robots in the media are more likely to appear in the context of events like the DARPA Robotics Challenge or come from companies like Boston Dynamics. Plus, the required technology has become accessible enough for us to build our own two-legged robots. So the torch has been passed, but Asimo will be remembered as the robot that pioneered a lot of thinking about how humanoid robots would interact with flesh-and-blood humans. It was one of the first robots that could recognize a human wave as a gesture and wave back in return.
Many concepts developed for Asimo will live on as Honda’s research team shifts focus to less humanoid form factors. We can see Honda’s new ambitions in the concept video released during CES 2018 (embedded below). These robots are still designed to live and work alongside people, but now they are specialized for different domains and they travel on wheels. Which is actually a step closer to the Jetsons’ future, because Rosie rolls on wheels!