Automated Home Manufacturing Combining 3D Printing With Robotics

[Florian] has been hard at work creating this automated setup to queue jobs for his 3D printer so it can keep running while he’s not around. It’s not quite finished, but the concept is there and he’s started doing some tests!

He’s using the uArm, a highly successful Kickstarter from earlier this year. It’s an Arduino-compatible, microcontroller-driven 4-axis parallel-mechanism robot arm based on the industrial ABB PalletPack robot.

As soon as he got all the parts, he set up a quick test that uses the uArm to swap out build platforms.
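
We don’t have [Florian]’s code, but the idea is simple enough to sketch with the stock Arduino Servo library: step four servos through a canned sequence that sweeps the finished build platform off the bed and returns the arm home. The pin numbers, joint labels and angles below are placeholders for illustration only, not values from the actual uArm build.

```cpp
// Hypothetical plate-swap sequence for a 4-servo arm in the spirit of the uArm.
// Joint names, pins and angles are made up -- tune for real hardware.
#include <Servo.h>

Servo baseRot, shoulder, elbow, gripper;          // generic stand-ins for the arm's axes

struct Pose { uint8_t base, shldr, elb, grip; };  // one waypoint, in degrees

// Rough choreography: hover over the bed, grab the plate edge,
// lift, swing to a drop-off rack, release, return home.
const Pose plateSwap[] = {
  { 90, 60, 120, 10 },   // home / hover over the print bed
  { 90, 40, 140, 10 },   // lower to the plate edge
  { 90, 40, 140, 80 },   // close the gripper on the plate
  { 90, 70, 110, 80 },   // lift clear of the bed
  { 10, 70, 110, 80 },   // swing to the drop-off rack
  { 10, 45, 130, 10 },   // set the plate down and release
  { 90, 60, 120, 10 },   // back to home
};

void goTo(unsigned i) {
  baseRot.write(plateSwap[i].base);
  shoulder.write(plateSwap[i].shldr);
  elbow.write(plateSwap[i].elb);
  gripper.write(plateSwap[i].grip);
  delay(1500);           // crude settling time; a real build would ramp smoothly
}

void setup() {
  baseRot.attach(9);
  shoulder.attach(10);
  elbow.attach(11);
  gripper.attach(6);
}

void loop() {
  // In a finished system this would be triggered by the printer's
  // "job done" signal; here it just runs once for testing.
  for (unsigned i = 0; i < sizeof(plateSwap) / sizeof(plateSwap[0]); i++) goTo(i);
  while (true) { }       // stop after one pass
}
```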

Continue reading “Automated Home Manufacturing Combining 3D Printing With Robotics”

Robot-Army IRL Plus A Massive Build Log

We went to “the dark room” at Maker Faire once more for an interview with [Sarah] of Robot-Army. She and [Mark], who handles software development for the project, were showing off 30 delta robots who know how to dance. Specifically they’re dancing in unison to the movements of another faire-goer. A Kinect sensor monitors those movements and translates them to matching motions from the deltabots.

You should remember seeing this project back in November. Now that the standard design for this model has been worked out, it was just a matter of sinking about three weeks into assembling the army. We’re happy to see that the Kickstarter made it to 250% of the goal at the beginning of March, and with that there are even bigger plans. [Sarah] says the goal remains to fill a room with the robots, and we may even see a much larger version some day.

The interview is a bit short since the Robot-Army booth was right next to Arc Attack (hence the noise-cancelling headphones) and we had to try to get in and out between their ear-drum-shattering interruptions. But you can see a ton more about the project in this huge build log post over on Hackaday.io. Also check out the Robot-Army webpage. There’s a nice illustration of their adventures at MFBA and the foam Jolly Wrencher made it into the piece!

Meet Jimmy: An Open Source Biped Robot From Intel


Intel’s CEO [Brian Krzanich] stopped by the Re/Code conference to announce Jimmy, the first robot from the 21st Century Robot project. The project is the brainchild of [Brian David Johnson], Intel’s resident futurist. We love the project’s manifesto:

 Robot Is: Imagined first. Easy to build. Completely open source. Fiercely social. Intentionally iterative. Filled with humanity and dreams. Thinking for her/him/itself.

Jimmy may not be all those things yet, but he definitely is exciting. For starters, he wasn’t built in some secret lab at Intel HQ. Much of Jimmy’s construction took place at Trossen Robotics, a name well known to Hackaday. [Matt] and [Andrew] at Trossen describe all the details in their video down past the break.

This version of Jimmy is a research robot, which means he’s not going to come cheap. Jimmy sports an Intel i5 NUC motherboard, 20 Dynamixel servos, a 5052 aluminum frame, and a host of sensors. A 4S 14.8 V 4000 mAh LiPo battery will power Jimmy for 30 to 60 minutes between charges, so be sure to budget for a few spare packs. The most striking aspect of Jimmy is his 3D printed shell. The 21st Century Robot Project gave him large, friendly eyes and features, which will definitely help with the social aspect of their goals.

Jimmy is all about open source. He can run two flavors of Linux: Ubuntu 14.04 LTS or a custom version of Yocto Poky. There is a lot to be said for running and developing on the same hardware. No specialized toolchains for cross-compiling, no NFS shares to move binaries around. If you need to make a change, you can plug in a monitor (or launch a VNC session) and do everything with Jimmy’s on-board computer. Jimmy’s software stack is based upon the DARwIn-OP platform, and a ROS port is in the works.

We’re excited about Jimmy, but at $16,000 USD, he’s a bit outside our budget. Thankfully a smaller consumer version of Jimmy will soon be available for around 1/10th the cost.

Continue reading “Meet Jimmy: An Open Source Biped Robot From Intel”

Synergizer: The Emergency Key-Turn Barbot


It’s been a rough day at the office. You need a break. But by yourself? No, what you need is to be Synergized! This Barbot only works if all four keys are inserted and turned — kind of like a nuclear launch procedure — only then will it dispense four perfectly sized drinks to make your day better.

The Synergizer uses an Arduino to control a belt-driven linear actuator which moves the spout from cup to cup. A series of reed switches along the length provide feedback to the system for positional control. The machine makes use of a peristaltic pump, called the Bartendro Dispenser, which pumps an exact volume of your liquid of choice into each cup. The cool thing about peristaltic pumps is that they’re self-priming and dispense a repeatable volume every time.

[Nick Poole], the designer, also included a CPU fan and heatsink paired with a Peltier plate to chill the liquid as it’s being pumped. To make it even more interesting, he added a four-key interlock, so the Synergizer can only be used if all four unique keys are inserted.
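
The write-up doesn’t include code, but the control flow is easy to picture. The sketch below is a rough, untested stand-in: it refuses to run until all four key switches read closed, drives the spout along the belt until the reed switch for the next cup trips, then runs the pump for a time proportional to the pour size. Pin assignments, the milliseconds-per-millilitre calibration, and driving the pump through a simple MOSFET (rather than the Bartendro’s own board) are all our assumptions.

```cpp
// Hypothetical Synergizer-style controller: four-key interlock, reed-switch
// positioning, and a timed peristaltic pour. All pins and constants are placeholders.
const uint8_t KEY_PINS[4]  = {2, 3, 4, 5};   // key switches, pulled to GND when turned
const uint8_t REED_PINS[4] = {6, 7, 8, 9};   // one reed switch per cup position
const uint8_t MOTOR_FWD    = 10;             // belt drive (return travel/homing omitted)
const uint8_t PUMP_PIN     = 12;             // peristaltic pump MOSFET

const float MS_PER_ML = 120.0;               // pump calibration (made up)
const float POUR_ML   = 44.0;                // one "perfectly sized" drink

bool allKeysTurned() {
  for (uint8_t i = 0; i < 4; i++)
    if (digitalRead(KEY_PINS[i]) == HIGH) return false;  // any open key blocks it
  return true;
}

void moveToCup(uint8_t cup) {
  digitalWrite(MOTOR_FWD, HIGH);                   // drive the spout along the belt
  while (digitalRead(REED_PINS[cup]) == HIGH) { }  // wait for that cup's reed switch
  digitalWrite(MOTOR_FWD, LOW);                    // stop on the mark
}

void pour() {
  digitalWrite(PUMP_PIN, HIGH);
  delay((unsigned long)(POUR_ML * MS_PER_ML));     // volume by timed pumping
  digitalWrite(PUMP_PIN, LOW);
}

void setup() {
  for (uint8_t i = 0; i < 4; i++) pinMode(KEY_PINS[i], INPUT_PULLUP);
  for (uint8_t i = 0; i < 4; i++) pinMode(REED_PINS[i], INPUT_PULLUP);
  pinMode(MOTOR_FWD, OUTPUT);
  pinMode(PUMP_PIN, OUTPUT);
}

void loop() {
  if (!allKeysTurned()) return;       // nuclear-launch rule: all four keys or nothing
  for (uint8_t cup = 0; cup < 4; cup++) {
    moveToCup(cup);
    pour();
  }
  while (allKeysTurned()) { }         // wait for keys to be removed before rearming
}
```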

Continue reading “Synergizer: The Emergency Key-Turn Barbot”

Canadian Space Robot Will Repair Itself

The video above shows an animation of what the Canadian Space Agency hopes will be the first successful self-repair of the Mobile Servicing System aboard the ISS. The Mobile Servicing System is basically a group of robots that can either perform complicated tasks on their own or be combined into a larger unit to extend the dexterity of the system as a whole.

The most recent addition to the servicing system is the Special Purpose Dexterous Manipulator, otherwise known as Dextre. Dextre is somewhat reminiscent of a human torso with two enormous arms, and is just one of the Canadian Space Agency’s contributions to the station. It was installed in 2008 to perform activities that would normally require spacewalks. Dextre’s very first official assignment was successfully completed in 2011, when the robot was used to unpack two pieces of equipment from the Kounotori 2 transfer vehicle while the human crew on board the ISS was sleeping.

Dextre is constructed in such a way that it can be grabbed by the Canadarm2 robot and moved to various work sites around the Space Station. Dextre can then operate from the maintenance site on its own while the Canadarm2 can be used for other functions. Dextre can also be operated while mounted to the end of Canadarm2, essentially combining the two robots into one bigger and more dexterous robot.

One of the more critical cameras on the Canadarm2 has started transmitting hazy images. To fix it, the Canadarm2 will grab onto Dextre, forming a sort of “super robot”. Dextre will then be positioned so that it can remove the faulty camera. The hazy camera will then be mounted to the mobile base component of the Mobile Servicing System, giving the ISS crew a new vantage point of a less critical location. The station’s human crew will then place a new camera module in the transfer airlock of Japan’s Kibo module. Dextre will be able to reach this new camera and mount it on the Canadarm2 to replace the original faulty unit. If successful, this mission will prove that the Mobile Servicing System can repair itself under certain conditions, opening the door for further self-repair missions in the future.

Mikey, The Robot That Charges Itself

Mikey is [Mike]’s autonomous robot. Like any good father, he’s given the robot his name. Mikey is an Arduino-based robot which uses a Pixy camera for vision.

[Mike] started with a common 4WD robot platform. He added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self charging.

One of the most basic features of life is eating. In the case of autonomous robots, that means self-charging. [Mike] gave Mikey the ability to self-charge by training the Pixy to detect a green square, which identifies Mikey’s charging station. Probes mounted on 3D printed brackets hold the positive leads, while springs on the base of the station make contact with conductive tape on Mikey’s belly. Once the circuit is complete, Mikey stops moving and starts charging.
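
[Mike] hasn’t published his sketch as far as we know, but the docking behaviour boils down to a simple loop: ask the Pixy for blocks matching the green-square signature, steer to keep the square centered, creep forward, and cut the motors the instant the charge-contact circuit closes. The fragment below is our own guess at that loop; it assumes the stock CMUcam5 Pixy Arduino library over SPI (rather than [Mike]’s serial link) and made-up motor and sense pins.

```cpp
// Hypothetical Pixy-guided docking loop in the spirit of Mikey.
// Assumes the stock Pixy Arduino library over SPI; pins and thresholds are placeholders.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const uint8_t GREEN_SIG = 1;    // signature trained on the green square
const uint8_t SENSE_PIN = A0;   // reads HIGH when the belly contacts close the circuit
const uint8_t LEFT_FWD  = 5;    // PWM pins into the motor driver (invented)
const uint8_t RIGHT_FWD = 6;

void drive(uint8_t left, uint8_t right) {
  analogWrite(LEFT_FWD,  left);
  analogWrite(RIGHT_FWD, right);
}

void setup() {
  pixy.init();
  pinMode(SENSE_PIN, INPUT);
}

void loop() {
  // Docked? Kill the motors and just sit there charging.
  if (digitalRead(SENSE_PIN) == HIGH) {
    drive(0, 0);
    return;
  }

  uint16_t n = pixy.getBlocks();          // grab the latest detected blocks
  for (uint16_t i = 0; i < n; i++) {
    if (pixy.blocks[i].signature != GREEN_SIG) continue;

    // Pixy frames are 320 px wide; steer to keep the square near x = 160.
    int16_t error = (int16_t)pixy.blocks[i].x - 160;
    int16_t turn  = constrain(error / 2, -60, 60);
    drive(constrain(120 + turn, 0, 255),  // positive error: square is to the right,
          constrain(120 - turn, 0, 255)); // so speed up the left side slightly
    return;
  }
  drive(80, 30);                          // no square in view: arc around and search
}
```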

Continue reading “Mikey, The Robot That Charges Itself”

Robot Runs On 6 Legs But Never More Than 2 At A Time

Looking at this legged robot gives us the same feeling we had the first time we saw a two-wheeled balancer. At first glance it just shouldn’t work, but after a little thought it makes a lot of sense. The six-legged bot called OutRunner uses two sets of three legs to propel itself. The footfalls are staggered to mimic how a biped runs, but mechanically it’s just spinning wheels to which the legs attach. If you have a smart enough algorithm, it will not only remain upright but be steerable too.
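
To see why staggering the two three-leg hubs reads as running, it helps to do the arithmetic: each hub plants a leg every 120° of rotation, so offsetting the hubs by 60° interleaves the contacts into six evenly spaced, alternating left/right footfalls per revolution. Here’s a toy calculation of our own (not OutRunner’s firmware) that spells it out:

```cpp
// Toy illustration of the staggered-footfall idea (not OutRunner code).
// Each hub carries 3 legs (a footfall every 120 degrees); offsetting the
// second hub by 60 degrees makes contacts alternate sides like a runner.
#include <cstdio>

int main() {
  const int legsPerHub = 3;
  const int spacing    = 360 / legsPerHub;   // 120 degrees between one hub's footfalls
  const int hubOffset  = spacing / 2;        // 60 degree stagger between the two hubs

  // List the footfalls over one wheel revolution, in the order they land.
  for (int angle = 0; angle < 360; angle += hubOffset) {
    const char *side = (angle % spacing == 0) ? "left hub " : "right hub";
    std::printf("%3d deg : %s touches down\n", angle, side);
  }
  // Output alternates left/right every 60 degrees -- six contacts per
  // revolution, which is why the gait looks like running to the eye.
  return 0;
}
```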

This is a Kickstarter offering that lets you get your hands on an unassembled kit for $200. That version comes with a universal camera mount but no camera. This may not sound like a problem, but look closer and you may notice what we have: the thing is remote-controlled and can run up to 20 MPH, but there’s no footage of it running slowly. We’d wager the need to keep itself balanced equates to the need to run rather than walk. Since it’s going to get away from you very quickly, you probably need a camera and a wearable display (or a chase car like in the video) to make the most out of the OutRunner. But hey, who’s complaining about that? Sounds like a ton of fun to us!

Why is it that this thing looks delightful but all of the Boston Dynamics running bots scare the crap out of us?

Continue reading “Robot Runs On 6 Legs But Never More Than 2 At A Time”