Delta Robot Is Sorting Golf Balls And Taking Names

It’s a common situation faced by every hard-working American – you get home after a long day at the calcium mines, and find yourself stuck with a pile of colored golf balls that simply aren’t going to sort themselves. Finally, you can put away your sorting funnels and ball-handling gloves – [Anthony] has the solution.

That’s right – it’s a delta robot, tasked with the job of sorting golf balls by color. A Pixy2 object tracking camera is used to survey the table, with the delta arms twitching around to allow the camera to get an unobstructed view. Once the position of the balls is known, a bubble sort is run and the balls rearranged into their correct color order.

[Anthony] readily admits the bubble sort is very inefficient at this task; it was an intentional choice so it could be later compared with other sorting methods. [Anthony] also goes into detail, sharing the development process of the suction gripper as well as discussing damping methods to reduce noise.
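For the curious, the idea is easy to sketch: run a bubble sort over the detected balls, and every time the algorithm would swap two array entries, dispatch that swap to the robot as a physical pick-and-place. Here’s a hedged Python sketch of that approach; the color ordering and the `swap_physical` callback are assumptions for illustration, not [Anthony]’s actual code:

```python
# Hypothetical sketch of a "physical" bubble sort: each in-memory swap is
# also sent to the delta robot so the balls on the table get rearranged.
COLOR_ORDER = {"red": 0, "yellow": 1, "green": 2, "blue": 3}

def bubble_sort_balls(balls, swap_physical=None):
    """balls: list of color names, left to right on the table."""
    balls = list(balls)
    n = len(balls)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if COLOR_ORDER[balls[j]] > COLOR_ORDER[balls[j + 1]]:
                # Swap in memory, and have the robot swap the real balls
                balls[j], balls[j + 1] = balls[j + 1], balls[j]
                if swap_physical:
                    swap_physical(j, j + 1)
    return balls
```

The O(n²) swap count is exactly why bubble sort makes the robot look so busy, and why comparing it against, say, selection sort (which minimizes physical moves) would be an interesting follow-up.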

Delta machines are always fun to watch, and are a good choice for sorting machines. We’ve seen some really tiny ones, too. Video after the break.

Continue reading “Delta Robot Is Sorting Golf Balls And Taking Names”

This Machine Teaches Sign Language

Sign language, like any language, can be difficult to learn if you’re not immersed in it, or at least learning from someone who is fluent. It’s not easy to know when you’re making minor mistakes or missing nuances. It’s a medium with its own unique challenges for learners, so if you want to learn and don’t have access to someone who knows the language, you might want to reach for the next best thing: a machine that can teach you.

This project comes from three of [Bruce Land]’s senior electrical and computer engineering students, [Alicia], [Raul], and [Kerry], as part of their final design class at Cornell University. Someone who wishes to learn the sign language alphabet slips on a glove outfitted with position sensors for each finger. A computer inside the device shows each letter’s proper sign on a screen, and then checks the sensors from the glove to ensure that the hand is in the proper position. Two letters include making a gesture as well, and the device is able to track this by use of a gyroscope and compass to ensure that the letter has been properly signed. It appears to only cover the alphabet and not a wider vocabulary, but as a proof of concept it is very effective.
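The checking step can be sketched in a few lines of Python. The per-letter target poses and normalized flex readings below are invented for illustration (and this sketch ignores the motion tracking needed for J and Z, the two letters of the ASL alphabet that involve a gesture):

```python
# Hypothetical sketch of the glove-check loop: compare each finger's
# sensor reading against the target pose for a letter, within tolerance.
TARGETS = {
    # thumb..pinky flex, normalized 0.0 (straight) to 1.0 (fully curled)
    "A": [0.2, 1.0, 1.0, 1.0, 1.0],
    "B": [0.9, 0.0, 0.0, 0.0, 0.0],
}
TOLERANCE = 0.15

def sign_matches(letter, readings):
    """True if every finger is within tolerance of the letter's target pose."""
    target = TARGETS[letter]
    return all(abs(r - t) <= TOLERANCE for r, t in zip(readings, target))
```

A tolerance band like this is what lets the device accept slightly imperfect signs while still flagging genuinely wrong finger positions.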

The students show that it is entirely possible to learn the alphabet reliably using the machine as a teaching tool. This type of technology could be useful for other applications as well, such as gesture recognition for a human interface device. If you want to see more of these interesting and well-referenced senior design builds we’ve featured quite a few, from polygraph machines to a sonar system for a bicycle.

Continue reading “This Machine Teaches Sign Language”

Get Your Tweets Without Looking

Head-mounted displays range from cumbersome to glass-hole-ish. Smart watches have their niche, but they still take your eyes away from whatever you are doing, like driving. Voice assistants can read to you, but they require a speaker that everyone else in the car has to listen to, or a headset that blocks out important sound. Ignoring incoming messages is out of the question, so the answer may be to use a different sense than vision. A joint project between Facebook Inc. and the Massachusetts Institute of Technology has a solution that uses the somatosensory reception of your forearm.

A similar idea came across our desk years ago and seemed promising, but it is hard to sell something that is more difficult than the current technique, even if it is advantageous in the long run. In 2013, a wearer had his or her back covered in vibration motors, and it acted like the haptic version of a spectrum analyzer. Now, the motors have been reduced in number to fit under a sleeve by utilizing patterns. It is being developed for people with hearing or vision impairment, but what drivers aren’t impaired while looking at their phones?

Patterns are what really set this version apart. Rather than relaying a discrete note on a finger, or a range of values across the back, each of the 39 English phonemes is given a unique sequence of vibrations, which is enough to encode any word. A phoneme is the smallest distinct unit of speech. The video below shows how those phonemes are translated to haptic feedback. Hopefully, sending tweets without using our hands or mouths is just a stepping stone on the upgrade path to complete telepathy.
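The encoding scheme is easy to sketch: look up each phoneme’s vibration sequence and concatenate them into one haptic stream for the word. The pulse table below is invented for illustration (which motors fire, and for how long, is not from the actual project):

```python
# Hypothetical sketch of phoneme-to-haptics encoding. Each phoneme maps to
# a short sequence of (motor_index, duration_ms) pulses on the sleeve.
PATTERNS = {
    "HH": [(0, 80)],
    "EH": [(1, 80), (2, 80)],
    "L":  [(3, 120)],
    "OW": [(2, 80), (1, 80)],
}

def encode_word(phonemes):
    """Concatenate each phoneme's pulse sequence into one haptic stream."""
    stream = []
    for p in phonemes:
        stream.extend(PATTERNS[p])  # e.g. "hello" -> HH EH L OW
    return stream
```

Because words are built from a fixed inventory of ~39 phoneme patterns, the wearer only has to learn a small vocabulary of vibrations to decode arbitrary speech.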

Continue reading “Get Your Tweets Without Looking”

Robot + Trumpet = Sad Trombone.mp3

[Uri Shaked] is really into Latin music. When his interest crescendoed, he bought a trumpet in order to make some energetic tunes of his own. His enthusiasm flagged a bit when he realized just how hard it is to get reliably trumpet-like sounds out of the thing, but he wasn’t about to give up altogether. Geekcon 2018 was approaching, so he thought, why not make a robot that can play the trumpet for me?

He scoured the internet and found that someone else had taken pains 20 years ago to imitate embouchure with a pair of latex lips (think rubber glove fingers filled with water). Another soul had written about measuring air flow with regard to brass instruments. Armed with this info, [Uri] and partners [Ariella] and [Avi] spent a few hours messing around with air pumps, latex, and water and came up with a proof of concept that sounds like—and [Uri]’s description is spot-on—a broken robotic didgeridoo. It worked, but the sound was choppy.

Fast forward to Geekcon. In a flash of brilliance, [Avi] thought to add capacitance to the equation. He suggested that they use a plastic box as a buffer for air, and it worked. [Ariella] 3D printed some fingers to actuate the valves, but the team ultimately ended up with wooden fingers driven by servos. The robo-trumpet setup lasted just long enough to get a video, and then a servo promptly burned out. Wah wahhhh. Purse your lips and check it out after the break.

If [Uri] ever gets fed up with the thing, he could always turn it into a game controller a la Trumpet Hero.

Continue reading “Robot + Trumpet = Sad Trombone.mp3”

Mechatronic Hand Mimics Human Anatomy To Achieve Dexterity

Behold the wondrous complexity of the human hand. Twenty-seven bones working in concert with muscles, tendons, and ligaments extending up the forearm to produce a range of motions that gave us everything from stone tools to symphonies. Our hands are what we use to interface with the physical world on a fine level, and it’s understandable that we’d want mechanical versions of ourselves to include hands that were similarly dexterous.

That’s a tall order to fill, but this biomimetic mechatronic hand is a pretty impressive step in that direction. It’s [Will Cogley]’s third-year university design project, which he summarizes in the first video below. There are two parts to this project; the mechanical hand itself and the motion-capture glove to control it, both of which we find equally fascinating. The control glove is covered with 3D-printed sensors for each joint in the hand. He uses SMD potentiometers to measure joint angles, with some difficulty due to breakage of the solder joints; perhaps he could solve that with finer wires and better strain relief.

The hand that the glove controls is a marvel of design, like something on the end of a Hollywood android’s arm. Each finger joint is operated by a servo in the forearm pulling on cables; the joints are returned to the neutral position by springs. The hand is capable of multiple grip styles and responds fairly well to the control glove inputs, although there is some jitter in the sensors for some joints.
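The glove-to-hand control loop boils down to mapping each potentiometer reading to a servo angle, ideally with a little smoothing to tame the sensor jitter. Here’s a minimal Python sketch of that mapping; the ADC range, servo limits, and smoothing factor are assumptions for illustration, not values from [Will]’s build:

```python
# Hypothetical glove-to-servo mapping for one finger joint.
ADC_MIN, ADC_MAX = 120, 900      # raw pot readings at straight / fully bent
SERVO_MIN, SERVO_MAX = 0, 180    # servo travel in degrees

def pot_to_servo(raw):
    """Linearly map a raw ADC value to a servo angle, clamped to range."""
    raw = min(max(raw, ADC_MIN), ADC_MAX)
    span = (raw - ADC_MIN) / (ADC_MAX - ADC_MIN)
    return SERVO_MIN + span * (SERVO_MAX - SERVO_MIN)

def smooth(prev, new, alpha=0.2):
    """Exponential smoothing to suppress jitter from a noisy pot."""
    return prev + alpha * (new - prev)
```

A low-pass filter like `smooth()` is a common first fix for jittery joints, at the cost of a little lag between the glove and the hand.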

The second video below gives a much more detailed overview of the project and shows how [Will]’s design has evolved and where it’s going. Anthropomorphic hands are far from rare projects hereabouts, but we’d say this one has a lot going for it.

Continue reading “Mechatronic Hand Mimics Human Anatomy To Achieve Dexterity”

Maker Faire NY: Where Robots Come Out To Play

There was an unbelievable amount of stuff on display at the 2018 World Maker Faire in New York. Seriously, an unreal amount of fantastically cool creations from all corners of the hacker and maker world: from purely artistic creations to the sort of cutting edge hardware that won’t even be on the rest of the world’s radar for a year or so, and everything in between. If you’ve got a creative bone in your body, this is the place for you.

But if there was one type of creation that stood out amongst all others, a general “theme” of Maker Faire if you will, it was robotics. Little robots, big robots, flying robots, battling robots, even musical robots. Robots to delight children of all ages, and robots to stalk the darkest corners of their nightmares. There were robots for all occasions. Probably not overly surprising for an event that has a big red robot as its mascot, but still.

There were far too many robots to cover them all, but the following is a collection of a few of the more interesting robotic creations we saw on display at the event. If you’re the creator of one of the robots we didn’t get a chance to get up close and personal with in our whirlwind tour through the Flushing Meadows Corona Park, we only ask that you please don’t send it here to exact your revenge. We’re very sorry. (Just kidding, if you have a robot to show off drop a link in the comments!)

Continue reading “Maker Faire NY: Where Robots Come Out To Play”

Turn Yourself Into A Cyborg With Neural Nets

If smartwatches and tiny Bluetooth earbuds are any indication, the future is with wearable electronics. This brings up a problem: developing wearable electronics isn’t as simple as building a device that’s meant to sit on a shelf. No, wearable electronics move and stretch; people jump, kick, punch, and sweat. If you’re prototyping wearable electronics, it might be a good idea to build a Smart Internet of Things Wearable development board. That’s exactly what [Dave] did for his Hackaday Prize entry, and it’s really, really fantastic.

[Dave]’s BodiHub is an outgrowth of his entry into last year’s Hackaday Prize. While the project might not look like much, that’s kind of the point; [Dave]’s previous projects involved shrinking thousands of dollars worth of equipment down to a tiny board that can read muscle signals. This project takes that idea a bit further by creating a board that’s wearable, has support for battery charging, and makes prototyping with wearable electronics easy.

You might be asking what you can do with a board like this. For that, [Dave] suggests a few projects like boxing gloves that talk to each other, or tell you how much force you’re punching something with. Alternatively, you could read body movements and synchronize an LED light show to a dance performance. It can go further than that, though, because [Dave] built a mesh network logistics tracking system that uses an augmented reality interface. This was actually demoed at TechCrunch Disrupt NY, and the audience was wowed. You can check out the video of that demo here.