Step into the Box


Take three industrial robots, two 4’ x 8’ canvases, and several powerful video projectors. Depending on who is doing the robot programming, you may end up with a lot of broken glass and splinters, or you may end up with The Box. The latest video released by The Creators Project, The Box features industrial robots and projection mapping. We recently featured Disarm from the same channel.

The Box is one of those cases of taking multiple existing technologies and putting them together with breathtaking results. We can’t help but think of the possibilities of systems such as CastAR while watching the video. The robots move two large canvases while projectors display a series of 3D images on them. A third robot moves the camera.

In the behind-the-scenes video, the creators reveal that the robots are programmed using a Maya plugin. The plugin allowed them to synchronize the robots’ movements with the animation. The entire video is one complex, choreographed dance; even the position of the actor was pre-programmed into Maya.
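The video doesn’t show the plugin’s internals, but the core idea of driving a robot from animation data is straightforward to sketch. This hypothetical snippet (the function name, rates, and data layout are our own illustration, not anything from the actual Maya plugin) resamples animation keyframes into fixed-rate joint targets a robot controller could step through in lockstep with the rendered frames:

```python
# Hypothetical sketch: turn sparse (time, angle) animation keyframes into
# fixed-rate joint targets, the kind of data a robot controller would follow
# in sync with the projected animation. Not the actual plugin's code.
from bisect import bisect_right

def sample_keyframes(keyframes, rate_hz, duration_s):
    """Linearly interpolate (time_s, angle_deg) keyframes at a fixed rate."""
    times = [t for t, _ in keyframes]
    targets = []
    steps = int(duration_s * rate_hz)
    for i in range(steps + 1):
        t = i / rate_hz
        j = bisect_right(times, t)
        if j == 0:
            targets.append(keyframes[0][1])       # before first keyframe
        elif j == len(keyframes):
            targets.append(keyframes[-1][1])      # after last keyframe
        else:
            (t0, a0), (t1, a1) = keyframes[j - 1], keyframes[j]
            frac = (t - t0) / (t1 - t0)
            targets.append(a0 + frac * (a1 - a0))
    return targets

# A 1 s move from 0° to 90°, sampled at 4 Hz
print(sample_keyframes([(0.0, 0.0), (1.0, 90.0)], 4, 1.0))
# → [0.0, 22.5, 45.0, 67.5, 90.0]
```

The same timeline that drives the render then drives the arms, which is what makes the canvases and the projected imagery line up so precisely.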

[Read more...]

Humanoid robot Kinects with its environment

[Malte Ahlers] from Germany, after completing a PhD in neurobiology, decided to build a human-sized humanoid robot torso. [Malte] has an interest in robotics and wanted to showcase some of his skills. The project is still in its early development, but as you will see in the video, he has achieved a nice build so far.

A1 consists of a human-sized torso with two arms, each with five (or six, including the gripper) axes of rotation, based on the robolink joints from German company igus.de. The joints are tendon-driven by stepper motors with planetary gear heads attached. Using an experimental controller he has built, [Malte] can track the position of each axis via the encoders embedded in the joints.
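Steppers are normally driven open-loop, so pairing them with joint encoders suggests a simple closed loop: compare where the encoder says the axis is against where it should be, and step toward the target. This is only a sketch of that idea (the function, gains, and counts-per-step assumption are ours, not [Malte]’s controller code):

```python
# Illustrative closed-loop sketch: command a stepper toward an encoder
# target, clamping how many steps a single control cycle may issue.
# All names and numbers are hypothetical, not from the actual controller.

def steps_toward(target_counts, encoder_counts, max_steps_per_cycle=10):
    """Return a signed step count that moves the axis toward its target."""
    error = target_counts - encoder_counts
    # Clamp so one control cycle never commands a huge burst of steps.
    return max(-max_steps_per_cycle, min(max_steps_per_cycle, error))

# Simulate a few control cycles with an ideal axis (1 step == 1 count)
position = 0
for _ in range(5):
    position += steps_toward(target_counts=27, encoder_counts=position)
print(position)  # → 27 (converges in three cycles: 10, 20, 27)
```

The clamping matters with tendon drives: a runaway burst of steps could overtension a tendon before the next encoder reading arrives.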

The A1 torso features a head with two degrees of freedom, equipped with a Microsoft Kinect sensor and two Logitech QuickCam Pro 9000 cameras. With these, the head can spatially “see” and “hear”. The head also has speakers for voice output, which can be accompanied by animated gestures on the LCD screen (lip movements, for example). The hands feature a simple gripping tool based on the FESTO FinGripper finger, allowing the robot to pick up miscellaneous items.

Robotic Manta Ray (Mantabot)

The robotic manta ray codenamed MantaBot, created by the Bio-Inspired Engineering Research Laboratory (BIER Lab), is set to make a splash. The next evolution in underwater robotics is here. We have seen the likes of robotic fish and jellyfish; now added to the school is the MantaBot, which has been designed to mimic the unique swimming motion of the manta ray.

This biologically inspired underwater robot has been designed with the primary goal of autonomy, using its onboard electronics to make its own decisions as it navigates its watery domain. BIER Lab has received major funding from the Department of Defense (DoD) Multidisciplinary University Research Initiative (MURI) program. Part of the long-term goal is to reverse engineer the biological systems of such creatures, to the point of creating simulated artificial skin and muscle.

[Via dvice.com]

[Read more...]

Artificial skin lets robots feel

BioTac artificial skin technology is sure to cause a storm among robotics designers, giving them the opportunity to add a third sense to their robotic marvels. Now they can have the sense of touch to go along with the existing senses of sight and sound, thanks to technology coming out of the University of Southern California.

They have chosen to call their sensor BioTac, a new type of tactile sensor designed to mimic the human fingertip with its soft, flexible skin. The sensor makes it possible to identify different textures by analyzing the vibrations produced as it brushes over materials. It can also measure applied pressure and the ambient temperature around the fingertip, so expect to see this technology in next-gen prosthetics. Let us know your thoughts on it.
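To get a feel for the vibration-analysis idea, here’s a deliberately simplified sketch. It is not BioTac’s actual signal processing (which is far more sophisticated); it just shows that even a crude statistic of the vibration trace, like its mean squared amplitude, already separates a rough surface from a smooth one:

```python
# Toy illustration of texture-from-vibration (NOT BioTac's algorithm):
# a rough surface excites a stronger vibration trace than a smooth one,
# so even the raw signal energy can tell the two apart.
import math

def vibration_energy(samples):
    """Mean squared amplitude of a vibration trace."""
    return sum(s * s for s in samples) / len(samples)

def classify(samples, threshold=0.1):
    """Crude rough/smooth decision from vibration energy (threshold is arbitrary)."""
    return "rough" if vibration_energy(samples) > threshold else "smooth"

# Synthetic traces sampled at 1 kHz: a strong 200 Hz ripple vs. near-silence
rough = [0.8 * math.sin(2 * math.pi * 200 * t / 1000) for t in range(1000)]
smooth = [0.01 * math.sin(2 * math.pi * 200 * t / 1000) for t in range(1000)]
print(classify(rough), classify(smooth))  # → rough smooth
```

Real texture identification looks at the frequency content of the vibrations, not just their energy, which is how the sensor can distinguish many materials rather than just two.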

[via technabob]
[Read more...]

Boxie is an adorable toddler videographer

Meet Boxie. He’s a robot videographer with levels of interaction we haven’t seen outside an episode of Dora the Explorer. The project was conceived by [Alex] as his MIT thesis project to see if robots can use humans to make themselves more useful. All we know is Boxie is freaking adorable, as evidenced by this video.

The idea behind Boxie was inspired by the Afghan Explorer: capture video in an attempt to tell a story. In the videos (after the break), Boxie wanders around the halls of MIT searching for people to help him (“can you carry me up the stairs?”) and tell him stories (“what do you do here?”). It’s an experiment in autonomous documentary directing, with the footage edited down into a video that makes sense.

[Alex] designed Boxie to be the cutest thing we’ve ever seen so he could elicit a response from the subjects of the documentary. We’re going to say the voice helped, but [Alex] found the cardboard robot factor also played into the success. Boxie was originally planned to have a plastic skin, but [Alex]’s friends thought it looked really creepy, so they suggested he go back to the prototype cardboard body. All we know is there’s finally a robot cuter than a Keepon.

[Read more...]

Analog robotic concepts

Everyone’s getting on board with the 555 timer projects. But [Tom] didn’t just come up with one project; he shared a slew of ideas related to analog robotics, centered around servo motor control. As you can see in the video after the break, he has a pleasing way of sharing a lot of details while keeping the demonstration easy to watch. He’ll put up a schematic for about one second and then move on, sparing those who don’t care about the details from any droning on.

The first schematic that flashes by is the main circuit for controlling the servo motor. The rest of the concepts build from this circuit, using light, sound, flex, and other sensors as inputs. For instance, the setup above is using a light sensor. When the ball blocks the light the servo moves that vertical rod hitting it out of the way. When it swings back the process repeats. It’s striking how lifelike the reactions are, reminding us of insect movements. But this is really just the tip of the iceberg as he’s got a lot of future video ideas that we can’t wait to see.
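For anyone wanting to try the servo-control circuit at the heart of these ideas, the arithmetic is friendly. A hobby servo expects a pulse of roughly 1–2 ms repeated about every 20 ms, and a 555 in monostable mode produces a pulse of width t = 1.1·R·C. The component values below are just an illustration, not taken from [Tom]’s schematics:

```python
# Back-of-the-envelope check for a 555-based servo driver: the monostable
# pulse width t = 1.1 * R * C must land in the servo's ~1-2 ms band.
# Component values here are illustrative, not from the actual schematic.

def pulse_width_ms(r_ohms, c_farads):
    """Monostable 555 pulse width, converted to milliseconds."""
    return 1.1 * r_ohms * c_farads * 1000  # seconds -> ms

# 15 kOhm with a 0.1 uF cap lands mid-range for a hobby servo:
print(round(pulse_width_ms(15_000, 0.1e-6), 3))  # → 1.65 ms
```

Swap the fixed resistor for a light-dependent resistor or flex sensor and the pulse width, and hence the servo position, tracks the sensor directly, which is exactly the trick these analog concepts exploit.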

[Read more...]

Low-cost video chat robot


[Johnny Chung Lee], having recently moved from Seattle to Mountain View, wanted a way to keep in touch with his fiancée, who would not be relocating for several more months. While most of us would likely consider purchasing a pair of webcams to keep in touch, he decided to do things his own way. Using an iRobot Create and a netbook, each about $250, he constructed a remote-controlled video chat robot that he can steer around his former abode from 1,000 miles away. While $500 might seem expensive at first, [Johnny] reminds us that commercial versions likely run into the thousands of dollars.

The whole setup is controlled using custom software to manage the movement of the robot, which can be used in conjunction with freely available videoconferencing applications, such as Skype. He also modified the iRobot’s charging station to charge both the robot and the netbook simultaneously – a process he explains, but precedes with several disclaimers.  Like some of his previous projects we have covered, he has made the C# source used in this project available for download on his site, along with documentation for both the control software and dock modifications.
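The movement side of such a setup is less mysterious than it sounds: the iRobot Create speaks a documented serial protocol (the Open Interface), where a Drive command is opcode 137 followed by a signed 16-bit velocity in mm/s and a signed 16-bit turn radius in mm, big-endian. [Johnny]’s software is C#, but the same packet can be sketched in a few lines of Python (the serial-port handling is omitted; only the packet format below comes from the Create’s documentation):

```python
# Minimal sketch of an iRobot Create Open Interface Drive command.
# Opcode 137 + signed 16-bit velocity (mm/s) + signed 16-bit radius (mm),
# both big-endian. The special radius 0x8000 (-32768) means "drive straight".
import struct

def drive_packet(velocity_mm_s, radius_mm):
    """Build the 5-byte Open Interface Drive command."""
    return struct.pack(">Bhh", 137, velocity_mm_s, radius_mm)

# Forward at 200 mm/s, straight ahead:
packet = drive_packet(200, -32768)
print(packet.hex())  # → 8900c88000
# With pyserial, this would be written to the Create's port, e.g.:
#   serial.Serial("/dev/ttyUSB0", 57600).write(packet)
```

Wrap commands like this behind a network socket and any videoconferencing app can run alongside, which is essentially the division of labor [Johnny]’s control software provides.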

Check out video of the robot in action after the jump.

[Read more...]