Transforming Robot Is More Than Meets The Eye

Let’s face it – building robust robots isn’t exactly easy. When designing them, builders often focus on a single method of locomotion in an attempt to create a reliable means of transportation. Whether it moves on the ground or in the air, there are always compromises to be made when designing a robot with the ability to travel over variable terrain. Looking to change that, researchers at the Center for Distributed Robotics have recently unveiled a robot that can travel on the ground with ease, then take to the skies in a matter of seconds.

The robot rolls along the ground on a set of wheels mounted at either end. When it is time to fly, it pushes itself up onto one end before extending its rotors. As you can see in the video below, the transition occurs pretty quickly.

The current prototype is pretty fragile and carries quite the hefty price tag. More robust revisions are already in the works, so expect to see more in the coming months.

[Thanks Sandeep]

Ollie The Socially Awkward Autonomous Blimp

[Pritika] is a user experience design student who just finished up an autonomous blimp project designed to react to voices and communicate “his friendliness and eagerness to be noticed.”

The Instructable [Pritika] posted goes through the build – an 850mAh LiPo battery powers an Arduino Pro Mini, which controls two 3.6 gram servos. There’s not much in the way of electronics, but the real beauty of this build is in the implementation. From watching the video of Ollie interacting with people, we’re pretty sure [Pritika] met her objective of making her pet blimp friendly and unobtrusive.

With quadrocopters getting so much attention as of late, it’s interesting to see development in lighter-than-air robotics. Our back-of-the-envelope math (which is admittedly rough) tells us that Ollie’s ‘body’ can lift around 60 grams when filled with helium, and only slightly more with hydrogen – both gases are already so much lighter than air that switching between them doesn’t buy much. While this isn’t much lifting capacity, it’s not inconceivable that a slightly larger blimp could carry more sensors or a live video feed, especially considering the 16 gram ornithopter we covered last year.
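For the curious, that figure comes straight from buoyancy: the blimp lifts the difference between the weight of the air it displaces and the weight of the gas inside the envelope. Here’s a quick sketch of the math in Python – the 60 liter envelope volume is purely our own guess, chosen to line up with the figure above rather than measured from Ollie.

```python
# Back-of-the-envelope buoyancy estimate for a small blimp (sea level, ~20 C).
AIR_G_PER_L = 1.20       # density of air, grams per liter
HELIUM_G_PER_L = 0.17    # density of helium, grams per liter
HYDROGEN_G_PER_L = 0.08  # density of hydrogen, grams per liter

envelope_liters = 60.0   # hypothetical envelope volume, not Ollie's real size

def net_lift_grams(volume_l, gas_density_g_per_l):
    """Weight of displaced air minus the weight of the lifting gas itself."""
    return volume_l * (AIR_G_PER_L - gas_density_g_per_l)

print(f"Helium:   {net_lift_grams(envelope_liters, HELIUM_G_PER_L):.0f} g of lift")
print(f"Hydrogen: {net_lift_grams(envelope_liters, HYDROGEN_G_PER_L):.0f} g of lift")
```

Whatever the real envelope volume, the takeaway is that hydrogen only offers somewhere around eight to ten percent more lift than helium.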

Check out a video of Ollie after the jump.

Stress Testing Robots…with Baseball Bats

When you are working on constructing the first Cyberdyne Systems Model 101 prototype (er, a super-robust robotic arm), you’ve got to test it somehow, right?

You probably recognize the robot being abused in the video below, as we have talked about the construction of its hand once before. The German Aerospace Center has been working on the DLR Hand Arm System for some time now, and is obviously really excited to show you how the design performs.

In case you are not familiar, the arm you see there uses 52 different motors, miniaturized control electronics, and a slew of synthetic tendons to behave like a human arm – only much better. The system’s joints not only provide an incredible amount of articulation, but are also specially designed to absorb and dissipate large amounts of energy without damaging the structure.

We think that any human would be hard-pressed to retain their composure, let alone be able to move their arm, after suffering a blow from a baseball bat, yet the robot arm carries on just fine. It’s awesome technology indeed.

Don’t Hit That Switch!

Hackaday reader [Danukeru] sent us a video featuring a box-based robot with an interesting personality. The box is fairly simple and from the outside seems to consist only of a switch and an LED. When the switch is flipped, however, the box comes to life.

When the box is activated, the lid opens, and a small arm reaches out to turn the switch off. We’ve seen that plenty of times, but this one turns out to be a little different. In the video, this process seems to repeat a couple dozen times before the robot gets angry and flips out. At first we thought that the end portion of the video was done with a bit of digital trickery, but after reviewing the creator’s blog, it looks like it could be legit. It is very hard to see the box’s innards in the video, but it does house a remote control car chassis that allows it to move around and spin out, as seen below.

It’s a pretty neat project, and if you can handle reading the creator’s site via Google Translate, there is plenty of picture documentation of the build process for your perusal.

A Friendly Spiderbot Named Chopsticks

After seeing his fair share of hexapod-style bots on the Internet, [Russell] decided he wanted to build one of his own. One of the downsides to building these robots is the cost. He often saw them constructed from laser-cut parts and very expensive servos. Rather than blow hundreds upon hundreds of dollars on the bot, [Russell] decided he could build a lightweight bot on the cheap using chopsticks and polymorph modeling plastic.

His octopod robot is aptly called “Chopsticks” and utilizes 28 different servos to control its motions: 24 drive its legs, three more are reserved for head movements, and a single additional servo manipulates the robot’s mandibles. The robot’s legs and main structure are composed of chopsticks, while the polymorph is used for feet, servo mounts, and pretty much anywhere else chopsticks just wouldn’t do.

[Russell] even added a set of eye stalks to complete the spider theme, arming them with IR compound eyes for object tracking. The robot is quite interactive as you can see in the video below.

Keep reading to see a video of Chopsticks, or swing by his Let’s Make Robots site if you get a chance – he has a pretty detailed construction journal as well as plenty of videos showing his spider bot in action.

Advanced Robotic Arm Control Using Kinect

[Ryan Lloyd], [Sandeep Dhull], and [Ruben D’Sa] wrote in to share a robotics project they have been keeping busy with lately. The three University of Minnesota students are using a Kinect sensor to remotely control a robotic arm, but it’s not as simple as it sounds.

Using OpenNI alongside PrimeSense’s middleware, the team started out by doing some simple skeleton tracking before working with their robotic arm. The arm has five degrees of freedom, making the task of controlling it a bit tricky. The robot has quite a few joints to play with, so the trio not only tracks shoulder, elbow, and wrist movements, but also monitors the status of the user’s hand to actuate the robot’s gripper.
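We haven’t dug through the trio’s code, but the heart of mapping a tracked skeleton onto servo angles is just vector math on the reported joint positions. Here’s a rough illustration of that idea in Python – the coordinates below are made-up stand-ins for whatever the skeleton tracker reports, not values from their project.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c, e.g. the elbow
    angle computed from the shoulder, elbow, and wrist positions."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# Hypothetical joint positions (millimeters) as a depth camera might report them.
shoulder = (0, 400, 2000)
elbow    = (250, 380, 1950)
wrist    = (400, 500, 1900)

print(f"Elbow servo target: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```

Repeat that for each joint being tracked, add a simple open/closed threshold on the hand to drive the gripper, and you have the bones of a teleoperation pipeline.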

When all was said and done, the results were pretty impressive, as you can see in the video below, but the team definitely sees room for improvement. Using inverse kinematics, they plan on filtering out some of the joint-tracking inaccuracies that occur when the shoulders are moved in a certain way. They also plan on using a robotic arm with even more degrees of freedom to see just how well their software can perform.

Be sure to check out their site to see more details and videos.

Real-time Robotic Arm Control With Blender

Last year, [Justin Dailey] was coming down the home stretch of his senior year as a Computer Engineering student and needed to build a final design project. He had always wanted to construct a robotic arm, and figured that there was no better way to legitimize such a project than to claim that it was “homework”.

While he originally wanted to control the arm with a joystick, he had been messing with Blender quite a bit leading up to his final project, and thought it would be pretty cool to let Blender do the work. He started out by testing his ability to control a single servo with Blender, then slowly increased the complexity of the project. He prototyped the arm using cardboard and, satisfied with his progress thus far, began constructing the arm out of aluminum.

Once he had all six of his servos attached to the arm’s joints and wired to his Roboduino, he got busy constructing a 3D model in Blender. A few Python scripts translate the movements inside Blender into serial data in real time, which is relayed to the Roboduino in order to control the arm.
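We haven’t reproduced [Justin]’s scripts here, but a Blender-to-serial bridge generally boils down to a frame-change handler that reads a bone’s rotation and writes a servo angle out the serial port. The sketch below is our own rough take – the armature name, bone name, port, and one-byte protocol are all placeholder assumptions, and it expects the pyserial module to be importable from Blender’s bundled Python.

```python
import math
import bpy      # Blender's built-in Python API
import serial   # pyserial -- assumed to be installed for Blender's interpreter

port = serial.Serial("/dev/ttyUSB0", 9600)  # placeholder port and baud rate

def send_pose(scene):
    """Read one bone's rotation each frame and ship it out as a servo angle."""
    # Hypothetical object/bone names; the bone's rotation mode must be Euler.
    bone = bpy.data.objects["Armature"].pose.bones["shoulder"]
    angle = math.degrees(bone.rotation_euler.x) + 90   # re-center around 90
    angle = max(0, min(180, int(angle)))               # clamp to servo range
    port.write(bytes([angle]))                         # one byte per update

# Fire on every frame change so animating or scrubbing streams data live.
bpy.app.handlers.frame_change_post.append(send_pose)
```

With six servos you would send one such byte (or a small framed packet) per joint on each update, and have the Roboduino sketch on the other end parse them back into PWM positions.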

Check out his site if you get a chance – there’s plenty of code to be had, as well as several videos of the arm in various stages of construction and testing.