Robot’s Actions And Our Reactions

If you walk into a dog owner’s home, that dog is probably going to make a beeline to see if you are a threat. If you walk into a cat owner’s home, you may see the cat wandering around, if it even chooses to grace you with its presence. For some people, a dog’s direct approach can be nerve-wracking or even scary, depending on their history and the relative size of the dog. Still, these domestic animals are easy to empathize with, especially if you or your family have a pet. They have faces which can convey curiosity or smug indifference, but what if you were asked to judge the intent of something with no analogs to our own physical features, like a face or limbs? That is what researchers at the IDC Herzliya in Israel and Cornell University in the US asked when they made the Greeting Machine to move a moon-like sphere around a planet-like sphere.

Participants were asked to gauge their feelings about the robot after watching the robot move in different patterns. It turns out that something as simple as a sphere tracing across the surface of another sphere can stir consistent and predictable emotions in people even though the shapes do not resemble a human, domestic pet, or anything but a snowman’s abdomen. This makes us think about how our own robots must be perceived by people who are not mired in circuits all day. Certainly, a robot jellyfish lazing about in the Atlantic must feel less threatening than a laser pointer with a taste for human eyeballs.


Continue reading “Robot’s Actions And Our Reactions”

Robot Arm Is A Fast Learner

Not long ago, machines grew their skills when programmers put their noses to the grindstone and mercilessly attacked those 104 keys. Machine learning is turning some of that around by replacing the typing with humans demonstrating the actions they want the robot to perform. Suddenly, a factory line-worker can be a robot trainer. This is not new, but a robot needs thousands of examples before it is ready to make an attempt. A new paper from researchers at the University of California, Berkeley, adds the ability to infer, so robots can perform a task after witnessing it just one time.

A robotic arm with no learning capability can only be told to go to (X, Y, Z), pick up a thing, and drop it off at (X2, Y2, Z2). Many readers have probably done precisely this in school or with a homemade arm. A learning robot generates those coordinates by observing repeated trials, then copies the trainer and saves the keystrokes. This new method can infer that when the trainer picks up a piece of fruit and drops it in the red bowl, the robot should make sure the fruit ends up in the red bowl, not just at the location where the red bowl used to be.

The ability to infer is built from many smaller lessons, like moving to a location, grasping, and releasing, and those are trained with regular machine learning, but the inference is the glue that holds it all together. If this sounds like how we teach children or train workers, then you are probably thinking in the right direction.
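To see why inference matters, here is a minimal sketch (our own illustration, not the Berkeley team's code or method) contrasting an arm that replays recorded coordinates with one that infers the goal behind a demonstration:

```python
# Hypothetical pick-and-place example. The scene dictionaries, function
# names, and coordinates are all made up for illustration.

def replay_demo(demo):
    """A plain learning-from-demonstration arm repeats the recorded
    coordinates: pick at the old fruit spot, drop at the old bowl spot."""
    return [("pick", demo["fruit_pos"]), ("place", demo["bowl_pos"])]

def infer_goal(demo, scene):
    """An inferring arm notices the demo ended with the fruit *in the
    bowl*, so it targets wherever the bowl sits in the current scene."""
    return [("pick", scene["fruit_pos"]), ("place", scene["bowl_pos"])]

demo = {"fruit_pos": (10, 20, 0), "bowl_pos": (40, 20, 0)}
scene = {"fruit_pos": (12, 25, 0), "bowl_pos": (70, 50, 0)}  # bowl moved

print(replay_demo(demo))        # drops fruit where the bowl used to be
print(infer_goal(demo, scene))  # drops fruit where the bowl is now
```

If the bowl moves between demonstration and execution, only the inferring version still puts the fruit in the bowl.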

Continue reading “Robot Arm Is A Fast Learner”

Arduino Fights Fire With… Water?

We don’t think we’d want to trust our fire safety to a robot carrying a few ounces of water, but as a demonstration or science project, [Tinker Guru’s] firefighting robot was an entertaining answer to the question: “What do I do with that flame sensor that came in the big box of Arduino sensors I bought from China?” You can see a video of the device below.

As you can see, it is a pretty standard two-wheel robot with the drive wheels at the rear and a skid plate up front. There’s a flame sensor and a water pump up forward as well. As you can probably guess, the device notices a flame and rushes over to squirt water on it.
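The see-flame-and-squirt behavior boils down to a short control loop. Here is a hedged sketch of that logic in Python, not [Tinker Guru]’s actual firmware; the two-sensor layout, threshold values, and command names are assumptions for illustration:

```python
# One iteration of a hypothetical flame-chasing loop: two analog flame
# readings in, a drive/pump command out. All numbers are invented.

FLAME_THRESHOLD = 600   # assumed reading that means "flame nearby"
STEER_DEADBAND = 50     # assumed left/right difference worth steering on

def step(left_ir, right_ir):
    """Return the drive command for one pass through the loop."""
    if max(left_ir, right_ir) < FLAME_THRESHOLD:
        return "search"          # no flame seen: keep wandering
    if abs(left_ir - right_ir) > STEER_DEADBAND:
        # steer toward whichever sensor reads hotter
        return "turn_left" if left_ir > right_ir else "turn_right"
    return "stop_and_pump"       # flame dead ahead: halt, run the pump

print(step(100, 120))  # no flame: search
print(step(800, 900))  # flame off to the right: turn_right
print(step(850, 870))  # centered: stop_and_pump
```

The real robot would repeat this loop continuously, re-reading the sensor as it closes in on the flame.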

Continue reading “Arduino Fights Fire With… Water?”

Can This Fire Fighting Robot Take The Heat?

Firefighting is a difficult and dangerous job, which puts humans on the front line to save life and property on a regular basis. It’s a prime candidate for some robot helpers, and [Ivan] has stepped in with a fun build that, while it won’t be serving in your municipal department any time soon, gets us thinking about the possibilities.

It’s a radio-controlled robot with an Arduino Uno for the brains. A couple of motor driver boards are used to run four windscreen wiper motors for propulsion. Long before the days of online shopping, the wiper motor was a hacker staple – a cheap, readily available high-torque motor that could be easily driven for a range of hobby projects. They say only ’90s kids remember.

As far as water delivery goes, this robot is a little short on credentials, carrying only 1 litre of water. However, we appreciate [Ivan]’s use of a Tupperware container as a tank – with a few add-on fittings, this could be a great way to hold water in other projects. The small DC-powered pump is controlled by an industrial solid state relay – a good choice for a robot that may get wet. There’s an onboard CO2 extinguisher as well, but it’s sadly not plumbed into anything just yet.

This build is an [Ivan] classic – big, fun, and 3D printed on a much larger scale than we’re used to. It’s a strong follow-up to his impressive tank build we saw earlier. Video after the break.

[Thanks to Baldpower for the tip!]

Continue reading “Can This Fire Fighting Robot Take The Heat?”

A Star-Trek-Inspired Robot With Raspberry Pi And AI

When [314Reactor] got a robot car kit, he knew he wanted to add some extra things to it. At about the same time he was watching a Star Trek episode that featured exocomps — robots that worked in dangerous areas. He decided to use those fictional devices to inspire his modifications to the car kit. Granted, the fictional robots were intelligent and had a replicator. So you know he won’t make an actual working replica. But then again, the ones on the TV show didn’t have all that either.

A Raspberry Pi runs TensorFlow using the standard camera. This lets it identify objects of interest (assuming it gets them right) and send the image back to the operator along with some identifying information. The kit already had an Arduino onboard, and the new robot talks to it via a serial port. You can see a video about the project below.

Continue reading “A Star-Trek-Inspired Robot With Raspberry Pi And AI”

Hexagrow Robot Packs A Serious Sensor Package

Automation is a lofty goal in many industries, but not always straightforward to execute. Welding car bodies in the controlled environment of a production line is relatively straightforward. Maintaining plants in a greenhouse, however, brings certain complexities due to the unpredictable organic processes at play. Hexagrow is a robot that aims to study automation in this area, developed as the final year project of [Mithira Udugama] and team.

The robot’s chassis is a very modern build, consisting of carbon fiber panels and 3D printed components. This kind of strength is perhaps overkill for the application, but it makes for a very light and rigid robot when the materials are used correctly.

Testing soil pH isn’t easy, but Hexagrow is up to the challenge.

It’s the sensor package where this build really shines, however. There’s the usual complement of temperature and humidity sensors and a soil moisture probe, as we’d expect. But there’s more, including an impressive soil pH tester. This involves a robotic arm with a scoop to collect soil samples, which are then weighed by a load cell. The measured weight is used to determine the correct amount of water to add to the sample. The mixture is then agitated before being tested by the probe to determine the pH level. It recalls memories of the science packages on Mars rovers, and it’s great to see this level of sophistication in a university project build. There’s even a LIDAR mounted on top for navigation, though it’s not clear whether this sensor is actually in functional use at this point in development.
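The pH test is really a fixed sequence: scoop, weigh, dilute in proportion to the weight, agitate, probe. A minimal sketch of that sequence, assuming a made-up dilution ratio and function name (the project’s actual values aren’t published in the summary above):

```python
# Hypothetical sequencer for the soil pH test. The 2 ml-per-gram
# dilution ratio is an invented placeholder, not from Hexagrow.

def soil_ph_sequence(sample_mass_g, water_per_gram_ml=2.0):
    """Return the ordered steps for one pH measurement, with the
    water volume scaled from the load-cell reading."""
    water_ml = sample_mass_g * water_per_gram_ml  # load cell sets dilution
    return [
        f"scoop and weigh sample ({sample_mass_g} g)",
        f"add {water_ml} ml water",
        "agitate mixture",
        "read pH probe",
    ]

for step in soil_ph_sequence(15.0):
    print(step)
```

Scaling the water volume from the measured mass is the key trick: without the load cell, an over- or under-diluted slurry would skew the probe reading.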

Plants can be demanding of their caretakers, so perhaps you’d best check you’re measuring your soil moisture correctly? Video after the break.

[Thanks to Baldpower for the tip!]

Continue reading “Hexagrow Robot Packs A Serious Sensor Package”

This Robot Swims, Skates, And Crawls

You often hear that art imitates life, but sometimes technology does too. Pliant Energy Systems’ Velox robot resembles an underwater creature more than it does a robot because it uses undulating fins to propel itself, as you can see in the video below.

The video shows the beast skating, but also swimming and walking. It really does look more like a lifeform than a device. According to the company, the robot has excellent static thrust per watt and is resistant to becoming entangled in plants and other debris.

Continue reading “This Robot Swims, Skates, And Crawls”