Travelling The Oregon Trail With An Apple II Robot

For one reason or another (in this case, a retro-futuristic 80s aesthetic), [Mike] decided to turn an Apple IIe into a robot. If you have to ask why, you’ll never know, but this project does have some interesting things going for it. There’s a voice synthesizer, a brand spankin’ new power supply, and it rolls around on the floor thanks to Apple BASIC.

Since this is a mobile robot, there needs to be a power supply in there somewhere. The Apple II had a fantastic switching power supply, but it ran off mains voltage. To make this Apple run off a 14.8 V LiPo battery, [Mike] needed to re-engineer that power supply to deliver +5, +12, -5, and -12 Volts. The easiest rail is +5 V, handled by a big ol’ LM1084 linear regulator. It dumps a ton of heat and probably isn’t the best solution, but it is a solution that works. The +12 V line is another linear regulator, an LM7812CV. Since it’s only dropping 14.8 V down to 12, the efficiency isn’t that bad, and since there’s no floppy drive it’s not pulling much current anyway. The negative voltages come from MAX764 and MAX765 inverting switching regulators. This completely replaces the original power supply in the Apple II, and it’s a decent reference design for anyone who wants to make a luggable Apple II laptop.
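To see why the +5 V rail runs hot while the +12 V rail gets away with it, here’s a quick back-of-the-envelope script. The load currents are our assumptions for illustration, not measurements from [Mike]’s board:

```python
# Rough thermals for the linear rails. The load currents are guesses.
V_IN = 14.8  # nominal 4S LiPo voltage; a full charge is nearer 16.8 V

def linear_reg(v_in, v_out, i_load):
    """Heat and efficiency for an ideal linear regulator."""
    p_heat = (v_in - v_out) * i_load  # the whole voltage drop becomes heat
    efficiency = v_out / v_in         # power delivered / power drawn
    return p_heat, efficiency

for name, v_out, i_load in [("+5 V (LM1084)", 5.0, 1.5),
                            ("+12 V (LM7812CV)", 12.0, 0.2)]:
    p, eff = linear_reg(V_IN, v_out, i_load)
    print(f"{name}: {p:.1f} W of heat, {eff:.0%} efficient")
```

At an assumed 1.5 A, the 5 V regulator burns nearly 15 W as heat at 34% efficiency, while the lightly loaded 12 V rail dissipates well under a watt, which tracks with the ton-of-heat observation above.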

To move this thing around, the motors run on their own 11.1 V LiPo, with a bunch of Pololu gear tying everything together. The BASIC code was written on an emulator and transferred over with the Floppy Emu. Movement is controlled through the output pins on the joystick port, and there’s a text-to-speech module that was obviously needed and ties this project together wonderfully. You can check out the video demo of the build below.

Continue reading “Travelling The Oregon Trail With An Apple II Robot”

Automate the Freight: Amazon Tackles the Last Mile Problem On Wheels

We’ve been occasionally exploring examples of what could be the killer application for self-driving vehicles: autonomous freight deliveries, both long-haul and local, as well as some special use cases. Some, like UAV delivery of blood and medical supplies in Rwanda, have taken off and are becoming both profitable and potentially life-saving. Others, like driverless long-haul trucking, made an initial splash but appear to have gone quiet since then. This is to be expected, as the marketplace picks winners and losers in a never-ending quest to maximize return on investment. But the whole field seems to have gotten a bit sleepy lately, with no big news of note for quite a while.

That changed last week with Amazon’s announcement of Scout, their autonomous delivery vehicle. Announced first on Amazon’s blog and later picked up by the popular and tech press, who repeated the Amazon material almost verbatim, Scout appears at first glance to be a serious attempt by Amazon to own the “last mile” of delivery – the local routes that are currently plied by the likes of UPS, FedEx, and various postal services. Or is it?

Continue reading “Automate the Freight: Amazon Tackles the Last Mile Problem On Wheels”

This 3D Printer Is Soft On Robots

It always seems to us that the best robots mimic things that are alive. For an example, look no further than the 3D-printed mesh structures from researchers at North Carolina State University. External magnetic fields make the mesh-like “robot” flex and move while floating in water. The mechanism can grab small objects and carry something as delicate as a water droplet.

The key is a viscous, toothpaste-like ink made from silicone microbeads, iron carbonyl particles, and liquid silicone. The resulting paste is amenable to 3D printing before being cured in an oven. Of course, the iron is the element that makes the thing sensitive to magnetic fields. You can see several videos of it in action below.

Continue reading “This 3D Printer Is Soft On Robots”

Badland Brawler Lets Arduino Tackle Terrain

For an electronics person, building the mechanics of a robot — especially a robust robot — can be somewhat daunting. [Jithin] started with an off-the-shelf four-wheel-drive chassis to build an off-road Arduino robot he calls the Badland Brawler. The kit was a bit over $100, but as you can see in the video below, it is pretty substantial, with an enclosed frame and large mud tires.

The remaining parts include an Arduino, a battery, and a motor driver IC. The Arduino is one with WiFi (an MKR 1000, in fact) and there’s a phone app for controlling the robot.

Honestly, once you have the chassis taken care of, the rest is pretty easy. Of course, the phone app is a bit more effort, but you could replace it in a number of ways. Blynk comes to mind, for example, or even a simple desktop script like the one sketched below.
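As a rough illustration of the roll-your-own route, here’s a tiny Python script that fires single-letter drive commands at the robot over WiFi. The IP address, port, and command letters are all invented for this sketch, so they would have to match whatever the MKR 1000 firmware actually listens for:

```python
# Minimal phone-app replacement: send one-letter drive commands over TCP.
# The address, port, and command letters are assumptions for this sketch.
import socket
import sys

ROBOT = ("192.168.1.50", 5005)  # hypothetical IP/port of the MKR 1000

def send(cmd: str) -> None:
    """Open a connection, send one command, and close."""
    with socket.create_connection(ROBOT, timeout=2) as s:
        s.sendall(cmd.encode() + b"\n")

if __name__ == "__main__":
    # e.g. `python drive.py F` for forward, `python drive.py S` for stop
    send(sys.argv[1].upper() if len(sys.argv) > 1 else "S")
```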

The motor drivers are easy to figure out; there’s a sketch of the usual interface below. This would be a great platform for some sensors to allow for more autonomy. We liked how the frame had mount points for a lot of different boards and sensors and could hold everything, for the most part, inside. That’s probably a good idea for a robot that will be traversing rugged terrain.
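The logic behind a typical dual H-bridge motor driver really is simple: two direction pins pick the polarity, and a PWM pin sets the speed. This hardware-free Python sketch just prints what the pins would do; it isn’t code for [Jithin]’s specific driver IC:

```python
# Typical dual H-bridge interface: two direction pins plus PWM per motor.
# This only prints the pin states; wire-up details vary by driver IC.

def drive(motor: str, speed: float) -> None:
    """speed runs -1.0 .. 1.0; the sign selects the H-bridge direction."""
    in1, in2 = (1, 0) if speed >= 0 else (0, 1)  # swap the pair to reverse
    duty = abs(speed) * 100                      # PWM duty cycle in percent
    print(f"{motor}: IN1={in1} IN2={in2} PWM={duty:.0f}%")

# Tank-style steering: run the sides in opposite directions to pivot.
drive("left", 0.8)
drive("right", 0.8)   # both forward: straight ahead
drive("left", -0.5)
drive("right", 0.5)   # opposite directions: pivot left
```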

If you do decide to roll your own app with Blynk, we’ve done it with a very different kind of robot. Four-wheel drive robots don’t have to be big, as we’ve seen in the past.

Continue reading “Badland Brawler Lets Arduino Tackle Terrain”

Robot Can’t Take Its Eyes Off The Bottle

Robots, as we currently understand them, tend to run on electricity. Only in the fantastical world of Futurama do robots seek out alcohol as both a source of fuel and recreation. That is, until [Les Wright] and his beer-seeking robot came along. (YouTube, video after the break.)

A Raspberry Pi 3 provides the brains, with an Intel Neural Compute Stick plugged in as an accelerator for neural network tasks. This hardware, combined with the OpenCV image detection software, enables the tracked robot to identify objects and track their position accordingly.
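In broad strokes, the detection side looks something like the sketch below. This is our minimal example using OpenCV’s dnn module with a MobileNet-SSD model, not [Les]’s actual code; the model filenames are placeholders, and the Myriad target needs an OpenCV build with Intel’s Inference Engine backend:

```python
import cv2

# Placeholder filenames -- substitute your own MobileNet-SSD model files.
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")
# Offload inference to the Neural Compute Stick (Myriad VPU).
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

BOTTLE = 5  # index of "bottle" in the 20-class Pascal VOC label set

def find_bottle(frame, min_confidence=0.5):
    """Return (x_center, box_area) of the first confident bottle, or None."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()  # shape (1, 1, N, 7)
    for i in range(detections.shape[2]):
        score = float(detections[0, 0, i, 2])
        if int(detections[0, 0, i, 1]) == BOTTLE and score > min_confidence:
            x1, y1, x2, y2 = (detections[0, 0, i, 3:7] *
                              [w, h, w, h]).astype(int)
            return (x1 + x2) // 2, (x2 - x1) * (y2 - y1)
    return None
```

The x-center is enough to steer toward the bottle, and the box area makes a crude proxy for distance.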

That a beer bottle was chosen is merely an amusing aside – the software can readily identify many different object categories. [Les] has also implemented a search feature, in which the robot will scan the room until a target bottle is identified. The required software and scripts are available on GitHub for your perusal.
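The search behavior can be as simple as a spin-and-look loop. Here’s the control flow, with placeholder drive functions and the find_bottle() sketch from above doing the looking:

```python
import time

def search_for_bottle(camera, turn_left, stop, step_time=0.15):
    """Pivot in small steps, pausing to look, until a bottle shows up.
    `camera` is e.g. a cv2.VideoCapture; `turn_left` and `stop` stand in
    for whatever drive functions the robot exposes."""
    while True:
        ok, frame = camera.read()
        if ok:
            hit = find_bottle(frame)
            if hit is not None:
                stop()
                return hit  # (x_center, box_area) to hand off to tracking
        turn_left()
        time.sleep(step_time)
        stop()  # settle the chassis before the next frame to avoid blur
```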

Over the past few years, we’ve seen an explosion in accelerator hardware for deep learning and neural network computation. This is, of course, particularly useful for robotics applications where a link to cloud services isn’t practical. We look forward to seeing further development in this field – particularly once the robots are able to open the fridge, identify the beer, and deliver it to the couch in one fell swoop. The future will be glorious!

Continue reading “Robot Can’t Take Its Eyes Off The Bottle”

Robot’s Actions and Our Reactions

If you walk into a dog owner’s home, that dog is probably going to make a beeline to see if you are a threat. If you walk into a cat owner’s home, you may see the cat wandering around, if it even chooses to grace you with its presence. For some people, a dog’s direct approach can be nerve-wracking or even scary, depending on their history and the relative size of the dog. Still, these domestic animals are easy to empathize with, especially if you or your family have a pet. They have faces which can convey curiosity or smug indifference, but what if you were asked to judge the intent of something with no analogs to our own physical features, like a face or limbs? That is what researchers at the IDC Herzliya in Israel and Cornell University in the US asked when they made the Greeting Machine, which moves a moon-like sphere around a planet-like sphere.

Participants were asked to gauge their feelings about the robot after watching the robot move in different patterns. It turns out that something as simple as a sphere tracing across the surface of another sphere can stir consistent and predictable emotions in people even though the shapes do not resemble a human, domestic pet, or anything but a snowman’s abdomen. This makes us think about how our own robots must be perceived by people who are not mired in circuits all day. Certainly, a robot jellyfish lazing about in the Atlantic must feel less threatening than a laser pointer with a taste for human eyeballs.

Continue reading “Robot’s Actions and Our Reactions”

Robot Arm is a Fast Learner

Not long ago, machines grew their skills when programmers put their noses to the grindstone and mercilessly attacked those 104 keys. Machine learning is turning some of that around by replacing the typing with humans demonstrating the actions they want the robot to perform. Suddenly, a factory line-worker can be a robot trainer. This is not new, but a robot typically needs thousands of examples before it is ready to make an attempt. A new paper from researchers at the University of California, Berkeley, adds the ability to infer, so robots can perform a task after witnessing it just one time.

A robotic arm with no learning capability can only be told to go to (X,Y,Z), pick up a thing, and drop it off at (X2, Y2, Z2). Many readers have probably done precisely this in school or with a homemade arm. A learning robot generates those coordinates itself by observing repeated trials, copying the trainer and saving the keystrokes. This new method can infer that when the trainer picks up a piece of fruit and drops it in the red bowl, the robot should make sure the fruit ends up in the red bowl, not just at the location where the red bowl used to be.
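To make the distinction concrete, here’s a toy sketch of ours (not the paper’s code) contrasting blind coordinate playback with re-grounding the goal in the current scene:

```python
# Toy contrast between coordinate playback and goal inference.
from dataclasses import dataclass

@dataclass
class Scene:
    """Current object positions: name -> (x, y, z)."""
    objects: dict

def naive_playback(demo):
    """Replay the demonstration's raw coordinates, verbatim."""
    return [("pick", demo["grasp_xyz"]), ("place", demo["drop_xyz"])]

def inferred_playback(demo, scene):
    """Re-resolve the demonstrated goal against where things are now."""
    return [("pick", scene.objects[demo["picked"]]),
            ("place", scene.objects[demo["target"]])]

demo = {"grasp_xyz": (0.2, 0.1, 0.0), "drop_xyz": (0.5, 0.3, 0.0),
        "picked": "apple", "target": "red bowl"}
# The bowl has moved since the demonstration was recorded:
scene = Scene({"apple": (0.2, 0.1, 0.0), "red bowl": (0.7, 0.6, 0.0)})

print(naive_playback(demo))            # places at the stale (0.5, 0.3, 0.0)
print(inferred_playback(demo, scene))  # follows the bowl to (0.7, 0.6, 0.0)
```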

The ability to infer is built from many smaller lessons, like moving to a location, grasping, and releasing. Those are trained with regular machine learning, but the inference is the glue that holds it all together. If this sounds like how we teach children or train workers, then you are probably thinking in the right direction.

Continue reading “Robot Arm is a Fast Learner”