Friday Hack Chat: Training Robots By Touch

When it comes to training robots, you could grab a joystick or carefully program movements in code. The better way, though, is to move the robot yourself, and have the robot play back all those movements ad infinitum. This is training robots by touch, and it’s the subject of this week’s Hack Chat over on hackaday.io.

Our guest for this week’s Hack Chat will be [Kent Gilson], inventor, serial entrepreneur, and pioneer in reconfigurable computing. [Kent] is the creator of Viva, an object-oriented programming language and operating environment that harnesses the power of FPGAs for general-purpose computing. He’s launched eight entrepreneurial ventures, won multiple awards, and created products used in numerous industries across the globe.

[Kent]’s claim to fame on hackaday.io is Dexter, a low-cost robotic arm with 50-micron repeatability and modular end effectors. It achieves that precision with three harmonic drives and optical encoders. The arm is also trainable, meaning you can guide it by hand and play back the exact path it took. That’s training robots by touch, exactly what this Hack Chat is all about.
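Dexter’s actual software isn’t reproduced here, but the record-and-playback idea at the heart of touch training is easy to sketch. The Python snippet below is a minimal illustration with a hypothetical `arm` object standing in for the real encoder and servo interfaces: sample the joints while a human guides the arm, then replay the samples at the same rate.

```python
import time

SAMPLE_HZ = 100  # sampling rate while the arm is guided by hand (assumed)

def record(arm, duration_s):
    """Sample encoder positions while the operator moves the arm."""
    trajectory = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        trajectory.append(arm.read_joint_angles())  # hypothetical: tuple of joint angles
        time.sleep(1.0 / SAMPLE_HZ)
    return trajectory

def play_back(arm, trajectory):
    """Replay the recorded path at the rate it was captured."""
    for joint_angles in trajectory:
        arm.command_joint_angles(joint_angles)  # hypothetical servo interface
        time.sleep(1.0 / SAMPLE_HZ)

# path = record(arm, duration_s=10)  # guide the arm by hand for ten seconds
# while True:
#     play_back(arm, path)           # ...and repeat ad infinitum
```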

For this Hack Chat, we’re going to be discussing:

  • Building trainable robots
  • Developing robotics haptics
  • Training robots to manufacture
  • Heterogeneous direct digital manufacturing

You are, of course, encouraged to add your own questions to the discussion. You can do that by leaving a comment on the Hack Chat Event Page and we’ll put that in the queue for the Hack Chat discussion.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week is just like any other, and we’ll be gathering ’round our video terminals at noon, Pacific, on Friday, July 27th. Need a countdown timer? Well, here you go, mango.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io.

You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.

Hackaday Prize Entry: HaptiVision Creates a Net of Vibration Motors

HaptiVision is a haptic feedback system for the blind that builds on a wide array of vibration belts and haptic vests. It’s a smart concept, giving the wearer a warning when an obstruction comes into sensor view.

The earliest research into haptic feedback wearables used ultrasonic sensors, and more recent developments used a Kinect. The project team for HaptiVision chose the Intel RealSense camera because of its svelte form factor. Part of the goal was to make the HaptiVision as discreet as possible, so fitting the whole rig under a shirt was part of the plan.

In addition to the RealSense camera, the team used an Intel Up board for the brains, mostly because it natively supports the RealSense camera. It takes a 640×480 IR snapshot and selectively triggers the 128 vibration motors to tell you what’s close. The motors are controlled by eight PCA9685-based PWM expander boards.
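The team’s firmware isn’t shown here, but fanning 128 motors out over eight PCA9685s is straightforward to sketch. The Python snippet below is a rough illustration assuming the Adafruit CircuitPython PCA9685 driver and expanders on consecutive I2C addresses (0x40–0x47); `update_motors` takes one depth reading per motor, e.g. the 640×480 frame downsampled to a 16×8 grid.

```python
import board
import busio
from adafruit_pca9685 import PCA9685

i2c = busio.I2C(board.SCL, board.SDA)
# Eight 16-channel expanders on consecutive addresses = 128 motor channels (assumed wiring).
expanders = [PCA9685(i2c, address=0x40 + n) for n in range(8)]
for pca in expanders:
    pca.frequency = 200  # PWM rate for the vibration motors (assumed)

MAX_RANGE_MM = 3000  # treat anything beyond 3 m as "nothing there" (assumed)

def update_motors(depth_grid_mm):
    """depth_grid_mm: 128 depth readings in millimeters, one per motor."""
    for i, depth_mm in enumerate(depth_grid_mm):
        closeness = max(0.0, 1.0 - depth_mm / MAX_RANGE_MM)  # nearer obstacle = stronger buzz
        expanders[i // 16].channels[i % 16].duty_cycle = int(closeness * 0xFFFF)
```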

The project is based on David Antón Sánchez’s OpenVNAVI project, which also featured a 128-motor array. HaptiVision aims to create an easy-to-replicate haptic system. Everything is Open Source, and all of the wiring clips and motor mounts are 3D-printable.

The Hackaday Prize: Exoskeletons for the Masses

While medical facilities continue to improve worldwide, access to expensive treatments still eludes a vast number of people. Prosthetics are a case in point: few can afford something so personalized, even though the need for assistive devices is extremely high. With that in mind, [Guillermo Herrera-Arcos] started working on ALICE, a robotic exoskeleton that is low-cost, easy to build, and, as an added bonus, 100% Open Source.

ALICE’s creators envision the exoskeleton finding applications in rehabilitation, human augmentation, and even gaming. Since it’s Open Source, it could also be used as a platform for STEM students to learn from. Currently, the team is testing the electronics in the exoskeleton’s legs, but they have already come a long way with their control system and have a workable prototype in place. Going forward, the creators, along with anyone else who develops on the platform, can keep improving and building upon it thanks to the nature of Open Source hardware.

Robotic Arms Controlled By Your….. Feet?

The days of the third hand’s dominance of workshops the world over are coming to an end. For those moments when a third hand alone is not enough, a fourth is there to save the day.

Dubbed MetaLimbs and developed by a team from the [Inami Hiyama Laboratory] at the University of Tokyo and the [Graduate School of Media Design] at Keio University, the device is designed to be worn while sitting — strapped to your back like a knapsack — though use while standing is possible, if perhaps a little unintuitive. Basic motion is controlled by the position of the leg — specifically, by sensors attached to the foot and knee — and flexing one’s toes actuates the robotic hand’s fingers. There’s even some haptic feedback built in to assist anyone who isn’t used to using their legs as arms.
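MetaLimbs’ control code isn’t public, so the Python sketch below is purely hypothetical, with every sensor and actuator object a stand-in, but it makes the mapping the team describes concrete: leg pose steers the arm, toe flexion closes the hand, and grip force comes back to the foot as vibration.

```python
def control_step(foot, knee, toes, arm, hand):
    """One control cycle; all objects here are hypothetical stand-ins."""
    # Foot position and knee bend set the robotic arm's target pose.
    x, y, z = foot.position()  # meters, in the wearer's frame
    arm.move_toward(x, y, z, elbow=knee.angle())

    # Toe flexion (0.0 relaxed .. 1.0 fully curled) drives finger closure.
    hand.set_grip(toes.flexion())

    # Haptic feedback: echo the hand's grip force back to the foot.
    foot.vibrate(strength=hand.grip_force())
```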

The team touts the option of customizable hands, though a soldering iron attachment may not be as precise as needed at this stage. Still, it would be nice to be able to chug your coffee without interrupting your work.


Keeping Humanity Safe from Robots at Disney

Almost every big corporation has a research and development organization, so it came as no surprise when we found a tip about Disney Research in the Hackaday Tip Line. And that the project in question turned out to involve human-safe haptic telepresence robots makes perfect sense, especially when your business is keeping the Happiest Place on Earth running smoothly.

That Disney wants to make sure their Animatronics are safe is good news, but the Disney project is about more than keeping guests healthy. The video after the break and the accompanying paper (PDF link) describe a telepresence robot with a unique hydrostatic transmission coupling it to the operator. The actuators are based on a rolling-diaphragm design that limits hydraulic pressure. In a human-safe system that’s exactly what you want.

The system is a hybrid hydraulic-pneumatic design; two actuators, one powered by water pressure and the other with air, oppose each other in each joint. The air-charged actuators behave like a mass-efficient spring that preloads the hydraulic actuator. This increases safety by allowing the system to be de-energized instantly by venting the air lines. What’s more, the whole system presents very low mechanical impedance, allowing haptic feedback to the operator through the system fluid. This provides enough sensitivity to handle an egg, thread a needle — or even bop a kid’s face with impunity.
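The numbers in the Python sketch below are illustrative rather than taken from the paper, but they show the force balance at a single joint: the water and air diaphragms push in opposition, so the joint only feels the pressure difference, and because the passive water circuit rides on the air preload, venting the air line lets everything go limp.

```python
DIAPHRAGM_AREA_M2 = 5e-4  # assumed 5 cm^2 rolling-diaphragm area, for illustration

def joint_force(p_water_pa, p_air_pa):
    """Net force at one joint: hydraulic drive minus pneumatic preload."""
    return (p_water_pa - p_air_pa) * DIAPHRAGM_AREA_M2

print(joint_force(160e3, 150e3))  # operator pushing gently: ~5 N at the joint
print(joint_force(150e3, 150e3))  # balanced preload: zero net force, free to move
```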

There are some great ideas here for robotics hackers, and you’ve got to admire the engineering that went into these actuators. For more research from the House of Mouse, check out this slightly creepy touch-sensitive smart watch, or this air-cannon haptic feedback generator.
