Cute But Serious-Faced Automata Produce A Pour Over

Check out the great workmanship that went into [TonyRobot]’s coffee vending version of ROBOT CAFE at Tokyo Maker Faire 2016. We’d really like to see this in action, so if anyone has more success than we did at tracking down more info (especially if it’s video) let us know in the comments below. We spot laser-cut wood making up the clever scoop design (and the numerous gears within it) but simply must know more.

Technically this is less “robot” and more “automata”. The cart charmingly fuses vending machine practicality with a visual display… and a great one at that. The aesthetic of the Robot Cafe leaps over the uncanny valley and fully embraces lovable robot faces.

Coffee is ground by a manual-style grinder into a scoop, which is then dumped into a pour-over filter. The hot water is then raised from below to pour over the grounds. These characters can be reconfigured based on the needs of the venue. The creator page linked above has three pictures of the same cart and the same robo-baristas, but in those shots they’re fishing for sodas instead. The glass bottles are lifted through the hole you can see on the right of the cart’s counter, using a fishing line with a magnet to grip the metal bottle cap.

We were delighted when robot vending machines started to appear, the kind with a big glass window and a gantry that grabs your corn-syrupy beverage. But take inspiration from this one: true vending nirvana is as much theater as it is utility.

[via Gizmodo Japan]

Building Pneumatic Actuators With 3D Printed Molds

Pneumatic actuators open up interesting possibilities in applications like soft robotics and interaction design. [Aidan Leitch] makes his own pneumatic actuators from silicone rubber. His actuators contain embedded air channels that can be filled with pressurized air and that collapse completely to a flat sheet when no pressure is applied. Continue reading “Building Pneumatic Actuators With 3D Printed Molds”

World’s Biggest, Most Useless AI Machine

In a time when we’re inundated with talk of an impending AI apocalypse, it’s nice to see an AI that’s intentionally useless. That AI is HAL 9000. No, not the conflicted HAL from the movie 2001: A Space Odyssey, but the World’s Biggest AI Useless Machine HAL built by [Rafael], [Mickey] and [Eyal] for GeekCon 2016 in Israel.

Standing tall, shiny and black, the box it’s housed in reminds us a bit of the monolith from the movie. But in a watchful position near the top is HAL’s red eye. As we approach, HAL’s voice from the movie asks “Just what do you think you’re doing, Dave?” while the eye changes diameter in keeping with the speech’s amplitude. And at the bottom is a bright yellow lever marked ON, which of course we just have to turn off. When we do, a panel opens up below it and a rod extends upward to flip the lever back to the ON position.
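That flip-it-back-on routine is the heart of any useless machine. As a rough illustration only, here’s a minimal sketch of how it could be done with a toggle switch and two hobby servos; the pins, angles, and timing below are our guesses, not details from the actual build.

```cpp
// Illustrative useless-machine routine: a toggle switch for the lever and two
// hobby servos for the hatch and pusher rod. Pins, angles, and delays are our
// guesses, not values from the actual HAL 9000 build.
#include <Servo.h>

Servo hatch;                  // panel that opens below the lever
Servo pusher;                 // rod that flips the lever back to ON
const int LEVER_PIN = 2;      // with INPUT_PULLUP: LOW while ON, HIGH once flipped off

void setup() {
  pinMode(LEVER_PIN, INPUT_PULLUP);
  hatch.attach(9);
  pusher.attach(10);
  hatch.write(0);             // hatch closed
  pusher.write(0);            // rod retracted
}

void loop() {
  if (digitalRead(LEVER_PIN) == HIGH) {   // someone has switched HAL "off"
    delay(300);                           // a short dramatic pause
    hatch.write(90);                      // open the panel...
    delay(400);
    pusher.write(120);                    // ...extend the rod to flip the lever back
    delay(600);
    pusher.write(0);                      // retract
    delay(400);
    hatch.write(0);                       // and close up again
  }
}
```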

Behind the scenes are two Arduinos. One Arduino manages the servos for the panel and rod, and plays random clips of HAL from the movie. The other Arduino uses the Arduino TVout library to output to a projector sitting behind the red diffuser that forms the eye. That Arduino also takes input from a microphone and, based on the amplitude, has the projector draw a white circle of corresponding diameter, making the eye appear to change size. You can see all this in action in the video after the break.
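The eye trick is easy to picture in code. Here’s a minimal sketch of the idea, assuming an electret microphone module on A0 and a 120×96 TVout frame; the pin, resolution, and scaling constants are our assumptions, not values from the project.

```cpp
// Illustrative sketch of the eye: a microphone's amplitude sets the diameter
// of a white circle drawn with the TVout library. The mic pin, resolution,
// and scaling constants are assumptions, not values from the project.
#include <TVout.h>

TVout TV;
const int MIC_PIN = A0;        // hypothetical analog pin for the microphone
const int CENTER_X = 60;       // middle of a 120x96 frame
const int CENTER_Y = 48;
const int MIN_RADIUS = 4;
const int MAX_RADIUS = 40;

void setup() {
  TV.begin(NTSC, 120, 96);     // composite video out to the projector
}

void loop() {
  // Sample the mic for ~20 ms and use the peak-to-peak swing as a rough
  // measure of how loudly HAL is currently speaking.
  unsigned long start = millis();
  int lo = 1023, hi = 0;
  while (millis() - start < 20) {
    int s = analogRead(MIC_PIN);
    if (s < lo) lo = s;
    if (s > hi) hi = s;
  }
  int amplitude = hi - lo;

  // Map amplitude to a radius and redraw the filled white "pupil".
  int radius = constrain(map(amplitude, 0, 300, MIN_RADIUS, MAX_RADIUS),
                         MIN_RADIUS, MAX_RADIUS);
  TV.clear_screen();
  TV.draw_circle(CENTER_X, CENTER_Y, radius, WHITE, WHITE);  // outline and fill in white
  TV.delay_frame(1);           // wait for the next video frame before redrawing
}
```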

Continue reading “World’s Biggest, Most Useless AI Machine”

Canary Islands Team Wins World Robotic Sailing 2016

If you’re like us, you had no idea that there even was a World Robotic Sailing Championship. We’re glad we know about it now! And congratulations to the team behind A-Tirma G2, the winning boat. (The link is in Spanish and tricky to translate; if you figure out how, let us know in the comments.)

The Championship has apparently been going on for nine years now, and moves to a different location around the world each year. The contests for 2016 (PDF) are by no means trivial. Besides a simple there-and-back regatta, the robot boats have to hold position, scan a prescribed area, and avoid a big obstacle before returning quickly to their lane. All of this with wind power, of course.

The winning boat used solid sails, which act essentially as vertical wings, and was designed for rough weather. This paid off in the area-scanning test: the winds were so strong that the organizers considered calling it off, but team A-Tirma’s boat navigated flawlessly, earning enough points to win the event even though a camera malfunction kept them from completing the obstacle avoidance.

Unless you’ve sailed, it’s hard to appreciate how difficult these challenges are for an autonomous vehicle. Planning far ahead is incredibly hard because the boat’s motive power source, the wind, isn’t constant; yet the boat has, relatively speaking, a lot of inertia and no brakes, so the robot has no choice but to plan well in advance anyway. That any of the 2-4 meter long boats could stay inside a circle of 20 meters is impressive. Oh, and did we mention that A-Tirma did all of this calculating and reacting on solar power?
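For a sense of what the station-keeping task asks of the boat’s software, the scoring boils down to a repeated geofence check against the target waypoint. Here’s a minimal, self-contained sketch of that check, assuming GPS coordinates in degrees; the waypoint values are made up, and this is purely our illustration, not any team’s code.

```cpp
// Illustrative station-keeping check: is the boat still within 20 m of the
// target waypoint? Uses an equirectangular approximation, which is accurate
// to well under a meter at this scale.
#include <cmath>
#include <cstdio>

constexpr double kEarthRadiusM = 6371000.0;
constexpr double kPi = 3.14159265358979323846;
constexpr double kDegToRad = kPi / 180.0;

// Approximate distance in meters between two lat/lon points given in degrees.
double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
  double meanLat = ((lat1 + lat2) / 2.0) * kDegToRad;
  double x = (lon2 - lon1) * kDegToRad * std::cos(meanLat);
  double y = (lat2 - lat1) * kDegToRad;
  return kEarthRadiusM * std::sqrt(x * x + y * y);
}

bool insideStationCircle(double boatLat, double boatLon,
                         double targetLat, double targetLon,
                         double radiusM = 20.0) {
  return distanceMeters(boatLat, boatLon, targetLat, targetLon) <= radiusM;
}

int main() {
  // A boat roughly 15 m east of the (made-up) target is still inside the circle.
  double targetLat = 28.13000, targetLon = -15.43000;
  double boatLat = 28.13000, boatLon = -15.42985;
  std::printf("distance: %.1f m, inside circle: %s\n",
              distanceMeters(boatLat, boatLon, targetLat, targetLon),
              insideStationCircle(boatLat, boatLon, targetLat, targetLon) ? "yes" : "no");
  return 0;
}
```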

Because the wind is so fickle, drone sailboats are much less popular than drone motorboats, at least by the Hackaday Blogpost Metric™. The hackerboat project is trying out sails, but they’re still mostly working on powered propulsion. We do have an entry in the 2016 Hackaday Prize, but it’s looking like the development process is in the doldrums. Still, sailing is the best way to go in the end, because wind power is essentially free on the open ocean, which means less work for the solar panels.

As far as role models go, you’ve basically got the entrants in the World Robotic Sailing Championship. So kudos to the A-Tirma team, and thanks [Nikito] for the tip!

Real, Life-Sized Transformers

Ever dreamed of having a real, life-sized Transformer in your garage? The Turkish startup Letrons now offers you exactly that: their animatronic Autobot drives like a car, transforms like a Transformer, and supposedly fights off space threats with its built-in smoke machine and sound effects.

The Letrons Transformer appears to be built on a BMW E92 coupé chassis. According to the company, the beast is packed with powerful hydraulics and servo motors, allowing it to transform and move quickly. Sensors all around the chassis give it some interactivity and keep it from crushing innocent bystanders when in remote-control mode. Interestingly, its movable arms aren’t attached to the body but to its extendable side wings, and they feature hands with actuated wrists and fingers. The Autobot can also move its head, which pops right out of the hood.

Admittedly, Letrons must have spent a lot of time working in secrecy on the dark side of the moon before releasing footage of a working, polished prototype. It’s unclear whether Letrons’s Transformers will cooperate with the US military in resolving armed conflicts, but they are certainly good for a show. Enjoy the video below!

Continue reading “Real, Life-Sized Transformers”

Grand Theft Auto V Used To Teach Self-Driving AI

For all the complexity involved in driving, it becomes second nature to respond to pedestrians, environmental conditions, even the basic rules of the road. When it comes to AI, teaching machine learning algorithms how to drive in a virtual world makes sense when the real one is packed full of squishy humans and other potential catastrophes. So, why not use the wildly successful virtual world of Grand Theft Auto V to teach machine learning programs to operate a vehicle?

The hard problem with this approach is getting a large enough sample set for the machine learning to be viable. The idea is this: the virtual world offers a far more efficient way to supply these programs with enough data than the time-consuming task of annotating objects in real-world images. In addition to scaling up the amount of data, researchers can manipulate weather, traffic, pedestrians and more to create complex conditions with which to train AI.

It’s pretty easy to teach the “rules of the road”; we do it with 16-year-olds all the time. But those earliest drivers have already spent a lifetime observing the real world and watching their parents drive. The virtual world inside GTA V is fantastically realistic: humans are great pattern recognizers, and fickle gamers would cry foul at anything that doesn’t mirror real life. What we’re left with is a near-perfect source of test cases for machine learning to be applied to the hard part of self-driving: understanding the vastly variable world every vehicle encounters.

A team of researchers from Intel Labs and Darmstadt University in Germany created a program that automatically indexes the virtual world (as seen above), creating useful data for a machine learning program to consume. This isn’t a complete substitute for real-world experience, mind you, but the freedom to make a few mistakes before putting an AI behind the wheel of a real vehicle has the potential to speed up the development of autonomous vehicles. Read the paper the team published, Playing for Data: Ground Truth from Video Games.
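To make the “automatic indexing” idea concrete: the game engine already knows which object every pixel belongs to, so per-pixel ground truth reduces to a table lookup rather than manual tracing. The snippet below is our conceptual sketch of that step, with invented IDs and class names; the researchers’ actual pipeline intercepts the game’s communication with the graphics hardware, as described in the paper.

```cpp
// Conceptual sketch only: map per-pixel object/resource IDs (known to the
// engine) to semantic classes to produce a label image for free. IDs and
// class names here are invented for illustration.
#include <cstdint>
#include <unordered_map>
#include <vector>

enum class SemClass : std::uint8_t { Unlabeled = 0, Road, Car, Pedestrian, Building, Sky };

// Table from an engine-side resource/object ID to a semantic class. In
// practice such a table is built once per resource and reused across frames,
// which is what makes annotation nearly free compared to manual labeling.
using LabelTable = std::unordered_map<std::uint32_t, SemClass>;

std::vector<SemClass> labelFrame(const std::vector<std::uint32_t>& objectIdBuffer,
                                 const LabelTable& table) {
  std::vector<SemClass> labels(objectIdBuffer.size(), SemClass::Unlabeled);
  for (std::size_t i = 0; i < objectIdBuffer.size(); ++i) {
    auto it = table.find(objectIdBuffer[i]);
    if (it != table.end()) {
      labels[i] = it->second;   // every pixel of a known object gets labeled at once
    }
  }
  return labels;
}
```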

Continue reading “Grand Theft Auto V Used To Teach Self-Driving AI”

Robotic Arm From Cardboard

Google showed the world that you could make a virtual reality headset from cardboard. We figure that might have been [Uladz]’s inspiration for creating a robotic arm also made out of cardboard. He says you can reproduce his design in about two hours.

You’ll need an Arduino and four hobby servo motors. The cardboard doesn’t weigh much, so you could probably get away with fairly small motors. In addition to the cardboard, there’s a piece of hardboard for the base and a few metal clips. You can control it all from the Arduino program, or add an IR receiver if you want to run it by remote control. There’s a video of the arm, called CARDBIRD, in action below.
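For a sense of how little code a build like this needs, here’s a minimal four-servo sketch; the pin numbers, joint names, angles, and motion sequence are our own guesses for illustration, not [Uladz]’s program.

```cpp
// Minimal four-servo arm sketch (illustrative only: pins, angles, and the
// motion sequence are assumptions, not taken from the CARDBIRD project).
#include <Servo.h>

Servo base, shoulder, elbow, gripper;

// Sweep a joint one degree at a time so the lightweight cardboard arm
// doesn't whip around and tear itself apart.
void moveSlow(Servo &joint, int from, int to) {
  int step = (to > from) ? 1 : -1;
  for (int a = from; a != to; a += step) {
    joint.write(a);
    delay(15);
  }
  joint.write(to);
}

void setup() {
  base.attach(3);
  shoulder.attach(5);
  elbow.attach(6);
  gripper.attach(9);
  // Start from a known pose.
  base.write(90);
  shoulder.write(90);
  elbow.write(90);
  gripper.write(40);
  delay(500);
}

void loop() {
  moveSlow(base, 90, 150);       // swing toward the object
  moveSlow(shoulder, 90, 70);    // lean forward
  moveSlow(elbow, 90, 60);       // reach down
  moveSlow(gripper, 40, 80);     // close the gripper
  moveSlow(elbow, 60, 90);       // lift
  moveSlow(shoulder, 70, 90);    // lean back
  moveSlow(base, 150, 90);       // swing back
  moveSlow(gripper, 80, 40);     // release
  delay(2000);                   // pause before repeating
}
```

Swapping in an IR receiver would just mean replacing the canned sequence in loop() with moves triggered by decoded remote-control codes.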

Continue reading “Robotic Arm From Cardboard”