Robotic Hand With Haptic Feedback

While I was at Heatsync Labs in Mesa, Arizona, [Nate] mentioned that he was really proud of having helped someone build a robotic hand. I tracked down that project because it looked pretty cool.

[Macguyver603] built this robotic hand that is controlled by a glove with flex sensors. He was originally going to 3D print the structure for the hand, but the availability of the laser cutter allowed him to create something that would be a little more structurally sound. Haptic feedback is supplied by vibrating pager motors that are triggered by sensors in the tips of the robotic hand’s fingers.
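The write-up doesn’t include firmware, but the control loop for a build like this is usually straightforward: map each flex sensor reading onto a servo angle for the matching finger, and buzz the pager motor whenever that fingertip sensor reports contact. Here’s a minimal Arduino-style sketch of the idea for a single finger; the pin assignments, sensor ranges, and threshold are placeholder values, not [Macguyver603]’s.

```cpp
#include <Servo.h>

// One finger shown; the real hand repeats this for all five digits.
const int FLEX_PIN      = A0;  // flex sensor on the glove (voltage divider)
const int FINGERTIP_PIN = A1;  // force sensor in the robotic fingertip
const int MOTOR_PIN     = 5;   // pager motor through a transistor (PWM)
const int SERVO_PIN     = 9;   // servo pulling the finger's tendon

// Calibration values -- these depend entirely on the sensors used.
const int FLEX_MIN = 300, FLEX_MAX = 700;  // raw ADC range of the flex sensor
const int TOUCH_THRESHOLD = 200;           // ADC reading that counts as "contact"

Servo finger;

void setup() {
  finger.attach(SERVO_PIN);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Map how far the glove finger is bent onto the servo's travel.
  int flex = analogRead(FLEX_PIN);
  int angle = map(constrain(flex, FLEX_MIN, FLEX_MAX), FLEX_MIN, FLEX_MAX, 0, 180);
  finger.write(angle);

  // Haptic feedback: buzz the pager motor harder the harder the fingertip presses.
  int touch = analogRead(FINGERTIP_PIN);
  if (touch > TOUCH_THRESHOLD) {
    analogWrite(MOTOR_PIN, map(touch, TOUCH_THRESHOLD, 1023, 80, 255));
  } else {
    analogWrite(MOTOR_PIN, 0);
  }

  delay(20);  // ~50 Hz update is plenty for servos and haptics
}
```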

The total cost of the project was roughly $240, and there’s unfortunately no video. It did, however, earn him second place at the state fair!

Meet OSCAR, The Google Hangout Robot

[Gus] made it to the Google+ developers vlog to show off his new Google+ hangout-controlled robot. This robot, named OSCAR (Overly Simplified Collaboratively Actuated Robot), drives around according to the whims of everyone in a Google+ hangout. Not only is the robot under remote control through a Google+ hangout, it also features a camera, allowing a hangout audience to explore a space in real time.

[Gus] built OSCAR out of an old Roomba he found in his parents’ basement. After attaching an Android tablet to the Roomba with some binder clips, [Gus] put a web server on the tablet and wrote a Google+ hangout extension allowing all hangout viewers to remotely control OSCAR.

Right now, all the commands received from the hangout are put into a queue, meaning everyone in the hangout has control of OSCAR. The next version will change those commands to deltas, or changes to the current state, canceling out conflicting commands. If only we had one of these while we were streaming for the Red Bull competition.
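To make the distinction concrete, here’s a rough sketch (in generic C++, not anything from [Gus]’s actual hangout extension) of the two schemes: the queue replays every viewer’s command in order, while the delta version sums everyone’s requested changes for the current tick, so one viewer’s ‘left’ and another’s ‘right’ cancel out instead of making OSCAR zig-zag.

```cpp
#include <queue>
#include <vector>

// A viewer's request, expressed as a change to the drive state.
struct Delta {
  int turn;     // -1 = left, +1 = right
  int forward;  // -1 = back,  +1 = forward
};

// Current scheme: every command sits in a queue and is executed in turn,
// so conflicting requests all get acted on, one after another.
Delta nextQueuedCommand(std::queue<Delta>& commands) {
  if (commands.empty()) return Delta{0, 0};
  Delta d = commands.front();
  commands.pop();
  return d;
}

// Planned scheme: sum everyone's requests for this tick; opposite requests cancel.
Delta mergeDeltas(const std::vector<Delta>& requests) {
  Delta merged{0, 0};
  for (const Delta& d : requests) {
    merged.turn += d.turn;
    merged.forward += d.forward;
  }
  return merged;  // e.g. one "left" and one "right" leaves turn == 0
}
```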

You can check out a demo of OSCAR after the break.

Continue reading “Meet OSCAR, The Google Hangout Robot”

Biologically-Inspired Robotic Eye Movements

Researchers at Georgia Tech have developed a biologically inspired system for controlling robots’ on-board cameras that simulates the saccadic optokinetic system of the human eye. Its similarity to the muscular system of the human eye is uncanny.

Joshua Schultz, a Ph.D. candidate, says that this system has been made possible in part by piezoelectric cellular actuator technology. Thanks to the actuators developed in their laboratory, it is now possible to capture many of the characteristics associated with the muscles of the human eye and their cellular structure.

The expectation is that the piezoelectric system could be used for future MRI-based surgery, furthering our ability to research and rehabilitate the human eye.

[via engadget]

Android Controlled Robot Extravaganza

We have no idea why, but since we featured Botiful, the Android-powered telepresence robot, a few days ago, the tip line has been awash in robot/Android mashups. Here are a few of the cool ones.

Using an Android as a remote control

[Stef] used a Samsung Galaxy S3 to control an old RC tank. The phone sends accelerometer and gyro data over Bluetooth to the tank, where it drives a pair of H-bridges to turn the wheels.
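The write-up doesn’t include [Stef]’s firmware, but the receiving end of a setup like this usually boils down to turning tilt into left and right motor speeds for the H-bridges. Here’s a hedged Arduino-style sketch of that mixing step; the pin assignments and the two-signed-bytes-per-frame protocol are assumptions, not [Stef]’s actual design.

```cpp
// Hypothetical receiver: the phone sends two signed bytes per frame over a
// Bluetooth serial link -- forward/back tilt and left/right tilt, -100..100.
const int LEFT_PWM = 5, LEFT_DIR = 4;    // H-bridge channel A
const int RIGHT_PWM = 6, RIGHT_DIR = 7;  // H-bridge channel B

void setMotor(int pwmPin, int dirPin, int speed) {  // speed: -255..255
  digitalWrite(dirPin, speed >= 0 ? HIGH : LOW);
  analogWrite(pwmPin, abs(speed));
}

void setup() {
  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_DIR, OUTPUT);
  Serial.begin(9600);  // Bluetooth module on the hardware serial port
}

void loop() {
  if (Serial.available() >= 2) {
    int forward = (int8_t)Serial.read();  // -100..100 from the phone's accelerometer
    int turn    = (int8_t)Serial.read();

    // Differential mix: turning slows one track and speeds up the other.
    int left  = constrain((long)(forward + turn) * 255 / 100, -255, 255);
    int right = constrain((long)(forward - turn) * 255 / 100, -255, 255);

    setMotor(LEFT_PWM, LEFT_DIR, left);
    setMotor(RIGHT_PWM, RIGHT_DIR, right);
  }
}
```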

Turning Android into a Robotic Operating System

ROS, or the Robot Operating System, provides a bunch of utilities for any type of robot, ranging from point-cloud mapping to multi-joint arm control. [Lentin] sent in a guide on installing ROS on Android. So far, he can get accelerometer data, grab stills from the on-board camera, have the robot speak, and use the phone’s small vibration motor. Here’s a (somewhat limited) demo of [Lentin] playing with ROS in a terminal.
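The nice part of getting ROS onto the phone is that readings like that accelerometer data become ordinary topics any node can subscribe to. As a hedged example, a minimal roscpp listener might look like this; the topic name /android/imu is a guess, not necessarily what [Lentin]’s setup actually publishes.

```cpp
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>

// Print the phone's accelerometer readings as they arrive on the topic.
void imuCallback(const sensor_msgs::Imu::ConstPtr& msg) {
  ROS_INFO("accel x=%.2f y=%.2f z=%.2f",
           msg->linear_acceleration.x,
           msg->linear_acceleration.y,
           msg->linear_acceleration.z);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "android_imu_listener");
  ros::NodeHandle nh;

  // "/android/imu" is a placeholder topic name -- check what the phone publishes.
  ros::Subscriber sub = nh.subscribe("/android/imu", 10, imuCallback);
  ros::spin();
  return 0;
}
```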

“Just a quick procrastination project”

Last May, [Josh] wrote in asking if a tread-based robot controlled through Skype would be a cool idea. We said ‘hell yeah’ and [Josh] scurried off to his workshop for a few months. He’s back with his tank-based robot. One really interesting bit is the robot responds to DTMF tones, allowing it to be controlled through Skype without any additional hardware. That’s damn clever. You can see a video of the SkypeRobot after the break.
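The post doesn’t say how the tones get decoded on the robot’s end, but a common software approach is the Goertzel algorithm: run one narrow-band filter per DTMF frequency over a short block of audio and see which row and column frequencies stand out. Here’s a rough C++ sketch of that detection step, with an assumed sample rate and threshold rather than anything from [Josh]’s build.

```cpp
#include <cmath>
#include <vector>

// Goertzel power of one target frequency over a block of audio samples.
double goertzel(const std::vector<float>& samples, double freq, double sampleRate) {
  const double coeff = 2.0 * std::cos(2.0 * M_PI * freq / sampleRate);
  double s1 = 0.0, s2 = 0.0;
  for (float x : samples) {
    double s0 = x + coeff * s1 - s2;
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2;
}

// Detect which DTMF key (if any) is present in the block; '\0' means none.
char detectDtmf(const std::vector<float>& samples, double sampleRate = 8000.0) {
  const double rows[4] = {697, 770, 852, 941};
  const double cols[4] = {1209, 1336, 1477, 1633};
  const char keys[4][4] = {{'1', '2', '3', 'A'},
                           {'4', '5', '6', 'B'},
                           {'7', '8', '9', 'C'},
                           {'*', '0', '#', 'D'}};
  const double threshold = 1e4;  // tune against real audio levels

  int bestRow = -1, bestCol = -1;
  double bestRowPower = threshold, bestColPower = threshold;
  for (int i = 0; i < 4; ++i) {
    double pr = goertzel(samples, rows[i], sampleRate);
    double pc = goertzel(samples, cols[i], sampleRate);
    if (pr > bestRowPower) { bestRowPower = pr; bestRow = i; }
    if (pc > bestColPower) { bestColPower = pc; bestCol = i; }
  }
  if (bestRow < 0 || bestCol < 0) return '\0';
  return keys[bestRow][bestCol];  // map the detected key to a motor command elsewhere
}
```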

Continue reading “Android Controlled Robot Extravaganza”

You Are A Sack Of Meat, Easily Punctured By Stompy

It may not be as cool as a bear riding a jet ski on a shark in outer space, but Stompy, the giant, rideable walking hexapod, comes very close.

A few months ago, we caught wind of a gigantic rideable hexapod project brewing at the Artisan’s Asylum hackerspace in Somerville, MA. The goal was to build an 18-foot-wide, two-ton rideable hexapod robot, with the side benefit of teaching students welding, coding, and other subjects related to giant machines and mechatronics.

The Stompy team has now launched a Kickstarter asking for donations to pay for the materials, plasma cutting, and other bits of hardware required to make Stompy a reality. Since there isn’t much information on amateur-level hydraulics, the project is open source; the trials of building Stompy will be made freely available for any other giant robot project.

Team Stompy has successfully built, debugged, and tested a half-size prototype of one hydraulically powered leg that is able to ‘row’ across the floor under its own power. This is a huge achievement for the team, and now they’ll move on to the full-size single leg prototype.

You can see the team’s single leg prototype in action after the break.

Continue reading “You Are A Sack Of Meat, Easily Punctured By Stompy”

Another Android Controlled Roving Robot

[Sam] has been working on a cellphone-controlled robot for a while now, and with the launch of a few similar Kickstarter campaigns, he thought it would be good to share his progress so far.

[Sam]’s robot is controlled by an Android device with the help of an IOIO dev board. This setup provides more than enough computational power to send a robot on its merry way, and has the bonus of allowing [Sam] to connect additional sensors.

The case is designed to put the IOIO board’s headers on the outside, just above a little shelf that’s perfect for holding a breadboard or two. With the right hardware and software setup, [Sam]’s bot can rove around the neighborhood collecting data and sending it to a server in real time.
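The post doesn’t spell out how that upload works (on an Android/IOIO bot it would most likely live in the phone-side app), but the idea boils down to periodically POSTing sensor readings to a web endpoint. Purely as an illustration, here’s what that might look like sketched in C++ with libcurl against a made-up endpoint URL.

```cpp
#include <curl/curl.h>
#include <string>

// Post one JSON-encoded sensor reading to a (hypothetical) collection endpoint.
bool postReading(const std::string& json) {
  CURL* curl = curl_easy_init();
  if (!curl) return false;

  struct curl_slist* headers = nullptr;
  headers = curl_slist_append(headers, "Content-Type: application/json");

  curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/api/readings");  // placeholder URL
  curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
  curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json.c_str());

  CURLcode res = curl_easy_perform(curl);

  curl_slist_free_all(headers);
  curl_easy_cleanup(curl);
  return res == CURLE_OK;
}

int main() {
  // e.g. called once per second from the robot's main loop
  postReading("{\"lat\": 42.36, \"lon\": -71.06, \"temp_c\": 21.5}");
  return 0;
}
```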

If you’re wondering why a tiny Android/IOIO-powered bot sounds so familiar, it might be because of the Botiful robot we posted a few days ago. Unlike Botiful, [Sam] can only control his treaded Android bot through Bluetooth, as the whole ‘programming a web interface’ thing is a bit over his head. Hopefully [Sam] will meet an enthusiastic coder when he brings his Arduino tank to Dorkbots Boston this evening.

You can check out a prototype of [Sam]’s bot in action after the break.

Continue reading “Another Android Controlled Roving Robot”

Robotic Manta Ray (Mantabot)

The robotic manta ray codenamed MantaBot, created by the Bio-Inspired Engineering Research Laboratory (BIER Lab), is set to make a splash. The next evolution in underwater robotics is here. We have seen the likes of robotic fish and jellyfish; now joining the school is the MantaBot, which has been designed to mimic the unique swimming motion of the manta ray.

This biologically inspired underwater robot has been designed with a primary goal of being autonomous, using its onboard electronics to make its own decisions as it navigates its watery domain. BIER Lab has received major funding from the Department of Defense (DoD) Multidisciplinary University Research Initiative (MURI) program. Part of the lab’s long-term goal is to reverse engineer the biological systems of such creatures to the point of creating simulated artificial skin and muscle.

[Via dvice.com]

Continue reading “Robotic Manta Ray (Mantabot)”