Real Or Fake? Robot Uses AI To Find Waldo

The last few weeks have seen a number of tech sites reporting on a robot which can find and point out Waldo in those “Where’s Waldo” books. Designed and built by Redpepper, an ad agency, the robot consists of a UARM Metal robotic arm with a Raspberry Pi controlling the show.

A Logitech C525 webcam captures images, which are processed by the Pi with OpenCV, then sent to Google’s cloud-based AutoML Vision service. AutoML is trained with numerous images of Waldo, which it uses to attempt a pattern match. If a match is found, the coordinates are fed to pyuarm, and the UARM will literally point Waldo out.
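The interesting glue in a pipeline like this is the last step: the vision service returns a bounding box in image coordinates, which has to be mapped into the arm’s workspace before anything can point. Here’s a minimal sketch of that hand-off; the workspace limits and calibration values are invented for illustration, and the actual pyuarm call is omitted since we don’t know Redpepper’s setup.

```python
def box_center(box):
    """Center of a normalized (x_min, y_min, x_max, y_max) bounding box."""
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2, (y_min + y_max) / 2)

def image_to_arm(u, v, workspace=((50, 250), (-150, 150))):
    """Linearly map normalized image coords (0..1) into arm X/Y in mm.

    The workspace bounds here are made-up numbers; a real build would
    calibrate them by jogging the arm to the corners of the camera view.
    """
    (x_lo, x_hi), (y_lo, y_hi) = workspace
    x = x_lo + u * (x_hi - x_lo)
    y = y_lo + v * (y_hi - y_lo)
    return x, y

# Example: the vision service reports Waldo dead center of the page.
u, v = box_center((0.45, 0.45, 0.55, 0.55))
x, y = image_to_arm(u, v)
# x = 150.0 mm, y = 0.0 mm -- the center of the (made-up) workspace
```

A real linear mapping like this only holds if the camera looks straight down; any tilt would call for a proper homography instead.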

While this is a totally plausible project, we have to admit a few things caught our jaundiced eye. The Logitech C525 has a field of view (FOV) of 69°. While we don’t have dimensions of the UARM Metal, it looks like the camera is less than a foot in the air. Amazon states that “Where’s Waldo Deluxe Edition” measures 10″ x 0.2″ x 12.5″. That means the open book will be 10″ x 25″. The robot is going to have a hard time imaging a surface that large in a single image. What’s more, the C525 is a 720p camera, so there isn’t a whole lot of pixel density to pattern match. Finally, there’s the rubber hand the robot uses to point out Waldo. Wouldn’t that hand block at least some of the camera’s view to the left?
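The FOV objection is easy to sanity-check with a little trigonometry. Treating the 69° figure as the full angle across the page and putting the camera a generous foot up (both assumptions on our part), the visible strip works out to:

```python
import math

def footprint(height_in, fov_deg):
    """Width of the surface strip a camera sees from a given height,
    for a given full field-of-view angle."""
    return 2 * height_in * math.tan(math.radians(fov_deg / 2))

view = footprint(12, 69)
print(f"{view:.1f} in")  # ~16.5 inches of coverage
```

Roughly 16.5 inches of coverage against a 25-inch open book, so a single frame can’t take in the whole spread even under charitable assumptions.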

We’re not going to jump out and call this one fake just yet — it is entirely possible that the robot took a mosaic of images and used that to pattern match. Redpepper may have used a bit of movie magic to make the process more interesting. What do you think? Let us know down in the comments!

This Is The Raspberry Pi Robot To Beat All Others

Before the introduction of the Raspberry Pi, building robots was hard. The best solution to turning motors on a chassis was repurposing an old Roomba. For the brain, maybe you could throw Linux on a router and move your rover around with an old Linksys. Before that, you could buy a crappy robotics kit, thrown together in a box and sold as an ‘educational kit’. I’m sure there are a few readers out there who built robots by wire-wrapping HC11s.

Now we have 3D printers and Raspberry Pis, and with that comes a golden age of robotics. One of the best robot brains out there is the 8BitRobots Modules from [Tim Wilkinson], an entry for this year’s Hackaday Prize.

The 8BitRobots Modules are made up of a few components, not the least of which is a Pi Zero, a fantastically powerful (for its price) Linux computer that is available for five dollars. With an add-on board, cleverly named the RoBonnet, the Pi Zero gets PWM outputs for servos and ESCs, an H-bridge for motors, TTL serial, encoder inputs, a pressure and temperature sensor, an IMU, a power monitor, and everything else you need for a successful Pi robot.

But hardware is only one part of the equation. If you want to program a robot, you need a software stack that makes everything easy. That’s where the 8BitRobots distributed robot platform comes in. This is a bit of Javascript running on the Pi that allows you to program the robot in Blockly, a Scratch-like graphical programming environment that’s been adapted to run in a web browser. It’s an all-in-one solution to robotics development and programming, and an excellent addition to this year’s Hackaday Prize.

NASA Wants You… To Design Their Robot

No one loves a good competition more than Hackaday. We run enough to keep anyone busy. But if you have a little spare time after designing your one-inch PCB, you might check out the competition to develop a robotic arm for NASA’s Astrobee robot.

Some of the challenges are already closed, but there are quite a few still open for a few more months (despite the published closing dates), and these look like great projects for a hacker. In particular, the software architecture and the command, data, and power system challenges are yet to start.

But don’t let the $25,000 fool you. That’s spread out over a number of awards for the entire series. Each task has an award ranging from $250 to $5,000. You have to win that award, of course. If you register, though, you do get a sticker that has flown on the space station.

If you haven’t seen Astrobee, it is a flying robot made to assist astronauts and cosmonauts on the International Space Station. The robot is really a floating sensor platform that can do some autonomous tasks but can also act as a telepresence robot for flight controllers. You might enjoy the second video below if you haven’t seen Astrobee before.

We’ve covered Astrobee before. If you’d like to visit the space station yourself, it isn’t quite telepresence, but Google can help you out.

Continue reading “NASA Wants You… To Design Their Robot”

DARPA Goes Underground For Next Challenge

We all love reading about creative problem-solving work done by competitors in past DARPA robotic challenges. Some of us even have ambition to join the fray and compete first-hand instead of just reading about them after the fact. If this describes you, step on up to the DARPA Subterranean Challenge.

Following up on past challenges to build autonomous vehicles and humanoid robots, DARPA now wants to focus collective brainpower on solving problems encountered by robots working underground. There will be two competition tracks: the Systems Track is what we’ve come to expect, where teams build both the hardware and software of robots tackling the competition course. But there will also be a Virtual Track, opening up the challenge to those without the resources to build big expensive physical robots. Competitors on the Virtual Track will run their competition course in the Gazebo robot simulation environment. This is similar to the NASA Space Robotics Challenge, where algorithms competed to run a virtual robot through tasks in a simulated Mars base. The virtual environment makes the competition accessible for people without machine shops or big budgets. The winner of NASA SRC was, in fact, a one-person team.

Back on the topic of the upcoming DARPA challenge: each track will involve three sub-domains, each of which has civilian applications in exploration, infrastructure maintenance, and disaster relief, as well as the obvious military applications:

  • Man-made tunnel systems
  • Urban underground
  • Natural cave networks

There will be a preliminary circuit competition for each, spaced roughly six months apart, to help teams get warmed up one environment at a time. But for the final event in the fall of 2021, the challenge course will integrate all three types.

More details will be released on Competitor’s Day, taking place September 27th 2018. Registration for the event just opened on August 15th. Best of luck to all the teams! And just like we did for past challenges, we will excitedly follow progress. (And have a good-natured laugh at fails.)

Line Following Robot Without The Lines

Line-following robots are a great intro to robotics in general, since the materials and skills needed to build a good one aren’t too advanced. It turns out that line-following robots are more than just a learning tool, too. They’re pretty useful in industry, but most of them don’t follow visible marked lines. Some, like this inductively guided robot from [Randall], make use of buried wires to determine their paths.

Some of the benefits of inductive guidance over physical lines are that the wires can be hidden in floors, so if something like an automated forklift is using them at a warehouse there will be fewer trip hazards and less maintenance of the guides. They also support multiple paths, so no complicated track switching has to take place. [Randall]’s robot is a small demonstration of a larger system he built as a technician for an autonomous guided vehicle system. His video goes into the details of how they work, more of their advantages and disadvantages, and a few other things.
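The core control idea behind inductive guidance is simple differential sensing: two pickup coils straddle the buried wire, the coil closer to the wire sees a larger induced voltage, and the vehicle steers toward the stronger signal. A minimal sketch of that loop, with the gain and signal levels invented for illustration (this is the general principle, not [Randall]’s specific implementation):

```python
def steer(left_coil, right_coil, gain=1.0):
    """Proportional steering command from two inductive pickup coils.

    The buried guide wire carries an AC current; the coil nearer the
    wire sees a larger induced voltage. The normalized difference is
    a signed error: negative steers left, positive steers right.
    """
    total = left_coil + right_coil
    if total == 0:
        return 0.0  # no signal at all: we've lost the wire, hold course
    error = (right_coil - left_coil) / total
    return gain * error

# Right coil reads stronger -> wire is off to the right -> steer right
print(steer(0.8, 1.2))   # positive command
# Coils balanced -> centered over the wire -> go straight
print(steer(1.0, 1.0))   # 0.0
```

Normalizing by the total signal makes the error roughly independent of how deep the wire is buried, which is part of why these systems tolerate being hidden under flooring.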

While inductive guided robots have been used for decades now, they’re starting to be replaced by robots with local positioning systems and computer vision. We’ve recently seen robots that are built to utilize these forms of navigation as well.

Continue reading “Line Following Robot Without The Lines”

Six Wheels (En)rolling: Mars Rovers Going To School

Few things build excitement like going to space. It captures the imagination of young and old alike. Teachers love to leverage the latest space news to raise interest in their students, and space agencies are happy to provide resources to help. The latest in a long line of educator resources released by NASA is an Open Source Rover designed at Jet Propulsion Laboratory.

JPL is the birthplace of Mars rovers Sojourner, Spirit, Opportunity, and Curiosity. They’ve been researching robotic explorers for decades, so it’s no surprise they have many rovers running around. The open source rover’s direct predecessor is ROV-E, whose construction process closely followed procedures for engineering space flight hardware. This gave a team of early career engineers experience in the process before they built equipment destined for space. In addition to learning various roles within a team, they also learned to work with JPL resources like submitting orders to the machine shop to make ROV-E parts.

Once completed, ROV-E became a fixture at JPL public events and occasionally visits nearby schools as part of educational outreach programs. And inevitably a teacher at the school would ask “The kids love ROV-E! Can we make our own rover?” Since most schools don’t have 5-axis CNC machines or autoclaves to cure carbon fiber composites, the answer used to be “No.”

Until now.

Continue reading “Six Wheels (En)rolling: Mars Rovers Going To School”

Robot Rovers Of The Early Space Race

In the early 1970s, the American space program was at a high point, having placed astronauts upon the surface of the Moon while their Soviet competitors had taken theirs no further than Earth orbit. It is, however, a simplistic view to take this as meaning that NASA had the lead in all aspects of space exploration, because while the Russians had not walked the surface of our satellite, they had achieved a less glamorous feat of lunar exploration that the Americans had not. The first Lunokhod wheeled rover had reached the lunar surface and explored it under the control of Earth-bound engineers in the closing months of 1970. And while the rovers driven by Apollo astronauts had placed American tread marks in the lunar soil and been reproduced on newspaper front pages and television screens worldwide, NASA had yet to match the Soviet achievements in autonomy and remote control.

At NASA’s Jet Propulsion Laboratory there was a project to develop technology for future American rovers under the leadership of [Dr. Ewald Heer], and we have a fascinating insight into it thanks to the reminiscences of [Mike Blackstone], then a junior engineer.

The aim of the project was to demonstrate the feasibility of a rover exploring a planetary surface, picking up and examining rocks. Lest you imagine a billion-dollar budget for gleaming rover prototypes, it’s fair to say that this was to be achieved with considerably more modest means. The rover was a repurposed unit that had previously been used for remote handling of hazardous chemicals, and the project’s computer was an extremely obsolete DEC PDP-1.

We are treated to an in-depth description of the rover and its somewhat arcane control system. Sadly we have no pictures save for his sketches, as the whole piece rests upon his recollections, but it sounds an interesting machine in its own right. Heavily armoured against chemical explosions, its two roughly-humanoid arms were operated entirely by chains similar to bicycle chains, with all motors resting in its shoulders. A vision system was added in the form of a pair of video cameras on motorised mounts; these could be aimed at an object using a set of crosshairs on each of their monitors, and their angles read off manually by the operator from the controls. These readings could then be entered into the PDP-1, upon which the software written by [Mike] could calculate the position of an object, calculate the required arm positions to retrieve it, and command the rover to perform the required actions.
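The position calculation [Mike] describes is classic triangulation: two cameras at known positions each report an aim angle, and the object sits where the two sight lines cross. We don’t know how his PDP-1 code was structured, but the plan-view geometry can be sketched like this (camera spacing and angles below are our own example numbers):

```python
import math

def locate(cam_a, cam_b, az_a, az_b):
    """Intersect two plan-view sight lines to locate an object.

    cam_a, cam_b: (x, y) camera positions.
    az_a, az_b:   azimuth angles in radians (from the +x axis), as
                  read off each camera's pan mount.
    Returns the (x, y) point where the two rays cross.
    """
    ax, ay = cam_a
    bx, by = cam_b
    da = (math.cos(az_a), math.sin(az_a))  # unit direction of ray A
    db = (math.cos(az_b), math.sin(az_b))  # unit direction of ray B
    # Solve A + t*da = B + s*db for t via the 2D cross product.
    denom = da[0] * db[1] - da[1] * db[0]
    if abs(denom) < 1e-12:
        raise ValueError("sight lines are parallel -- no fix")
    t = ((bx - ax) * db[1] - (by - ay) * db[0]) / denom
    return ax + t * da[0], ay + t * da[1]

# Cameras a metre apart at shoulder level, both sighting a rock at (0.5, 2.0)
x, y = locate((0.0, 0.0), (1.0, 0.0),
              math.atan2(2.0, 0.5), math.atan2(2.0, -0.5))
# x ~= 0.5, y ~= 2.0
```

The real problem also had tilt angles and a third dimension, of course, but the principle is the same: two known rays, one intersection, then inverse kinematics for the arms.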

The program was a success, producing a film for evaluation by the NASA bigwigs. If it still exists, it would be fascinating to see; perhaps our commenters may know where it might be found. Meanwhile, if the current JPL research on rovers interests you, you might find this 2017 Hackaday Superconference talk to be of interest.

Thanks [JRD] for the tip.