Can Robots Give Good Hugs?

We could all use a hug once in a while. Most people would probably say the shared warmth is nice, and the squishiness of another living, breathing meatbag is pretty comforting. Hugs even have health benefits.

But maybe you’re new in town and don’t know anyone yet, or you’ve outlived all your friends and family. Or maybe you just don’t look like the kind of person who goes for hugs, and therefore you don’t get enough embraces. Nearly everyone needs and wants hugs, whether they’re great, good, or just average.

So what makes a good hug, anyway? It’s a bit like a handshake. It should be warm and dry, with a firmness appropriate to the situation. Ideally, you’re both done at the same time and things don’t get awkward. Could a robot possibly check all of these boxes? That’s the idea behind HuggieBot, the haphazardly humanoid invention of Katherine J. Kuchenbecker and team at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany. User feedback helped the team get their arms around the problem.

MIT’s Hair-Brushing Robot Untangles Difficult Robotics Problem

Whether you care to admit it or not, hair is important to self-image, and not being able to deal with it yourself feels like a real loss of independence. To help people with limited mobility, researchers at MIT CSAIL have created a hair-brushing robot that combines a camera with force feedback and closed-loop control to adjust on the fly to any hair type, from straight to curly. They achieved this by examining hair as double helices of soft fibers and developing a mathematical model to untangle them much like a human would: by working from the bottom up.

It may look like a hairbrush strapped to a robot arm, but there’s more to it than that. Before it ever starts brushing, the robot’s camera takes a picture that gets cropped down to a rectangle of pure hair data. This image is converted to grayscale, and then the program analyzes the x/y image gradients. The straighter the hair, the more edges it has in the x-direction, whereas curly hair is more evenly distributed. Finally, the program computes the ratio of straightness to curliness, and uses this number to set the pain threshold.
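For the curious, here’s a rough sketch of what that gradient trick might look like in Python with NumPy. To be clear, this is our own back-of-the-envelope illustration, not CSAIL’s code, and the function names and force numbers are invented placeholders.

```python
import numpy as np

def straightness_ratio(gray):
    """Estimate straight-vs-curly from a grayscale crop of pure hair data.

    Straight strands hang mostly vertically, so their edges show up as
    gradients along the x-axis; curly hair spreads its gradient energy
    more evenly between x and y.
    """
    gy, gx = np.gradient(gray.astype(float))        # per-axis image gradients
    x_energy = np.abs(gx).sum()
    y_energy = np.abs(gy).sum()
    return x_energy / (x_energy + y_energy + 1e-9)  # ~1.0 straight, ~0.5 curly

def brushing_force_limit(ratio, gentle=0.5, firm=2.0):
    """Map the straightness ratio to an allowable brushing force (newtons).

    The numbers are placeholders; the point is that curlier hair (lower
    ratio) gets a gentler pain threshold than straight hair.
    """
    return gentle + (firm - gentle) * ratio
```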

The brush is equipped with sensors that measure the forces exerted on the hair and scalp during brushing, and the robot compares this input to a baseline established by a human who used the brush on their own hair. We think it would be awesome if the robot could grasp the section of hair first so the person can’t feel the pull against their scalp, and start by brushing out the ends before brushing from the scalp down, but we admit that would be asking a lot. Maybe they could get it to respond to exclamations like ‘ow’ and ‘ouch’. Human trials are still in the works. For now, watch it gently brush out various wigs after the break.
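If you squint, the closed-loop part could be as simple as the sketch below: back off whenever the sensed pull climbs too far above the human baseline. Again, the names and numbers here are our guesses, not anything from the CSAIL controller.

```python
def brushing_step(measured_force, baseline_force, stroke_speed,
                  overshoot=1.5, backoff=0.5):
    """One control tick of a hypothetical brushing loop.

    Compares the sensed pull against the human-established baseline and
    eases off the stroke speed if it tugs too hard. All parameters are
    illustrative placeholders.
    """
    if measured_force > overshoot * baseline_force:
        return stroke_speed * backoff   # too much tugging, slow the stroke
    return stroke_speed                 # within the comfort band, carry on
```

An ‘ow’ detector would just be one more input that knocks the speed down in the same loop.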

Even though we have wavy hair that tangles quite easily, we would probably let this robot brush our hair. But this haircut robot? We’re not that brave.

Hackaday Prize Entry: Fighting Dehydration One Sip At A Time

Humans don’t survive long without water, and most people walk around in a chronic state of mild dehydration even when they have access to plenty of drinking water. It’s hard to stay properly hydrated, and harder still to keep track of your intake; automating that bookkeeping is the idea behind this water-intake monitoring IoT drinking straw.

Dehydration is a particularly acute problem in the elderly, since the sense of thirst tends to diminish with age. [jflaschberger]’s Hackaday Prize entry seeks to automate the tedious and error-prone job of recording fluid intake, something that caregivers generally have to take care of by eyeballing that half-empty glass and guessing. The HydrObserve uses a tiny turbine flowmeter that can mount to a drinking straw or water bottle cap. A Hall sensor in the turbine sends flow data to a Cypress BLE SoC module, which totalizes the volume sipped and records a patient identifier. A caregiver can then scan the data from the HydrObserve at the end of the day for charting and to find out if anyone is behind on their fluids.
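We haven’t seen the firmware, but the totalizing logic is probably close in spirit to the little sketch below; the calibration constant and the names are placeholders of our own, not anything from the HydrObserve sources.

```python
PULSES_PER_ML = 4.5  # placeholder calibration: Hall pulses per millilitre of flow

class SipTotalizer:
    """Accumulate drinking volume from turbine pulse counts for one patient."""

    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.total_ml = 0.0

    def on_sip(self, pulse_count):
        """Called with the number of Hall-sensor pulses counted during one sip."""
        self.total_ml += pulse_count / PULSES_PER_ML

    def daily_summary(self):
        """What a caregiver would scan off the device at the end of the day."""
        return {"patient": self.patient_id, "intake_ml": round(self.total_ml)}
```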

There are problems to solve, not least being the turbine, which doesn’t appear to be food safe. But that’s a small matter that shouldn’t stand in the way of an idea as good as this one. We’ve seen a lot of good entries in the Assistive Technology phase of the 2017 Hackaday Prize, like a walker that works on stairs or sonic glasses for the blind. There are only a couple of days left in this phase — got any bright ideas?

The Race To Develop Technology That Enhances Elder Care

It happens with every generation – we’re born, our parents care for us and nurture us, we grow up, they grow old, and then we switch roles and care for them. Soon it’ll be my turn to be the caregiver to my parents, and I recently got a preview of things to come when my mom fell and busted her ankle. That it wasn’t the classic broken hip was a relief, but even “just” a broken ankle was difficult enough to deal with. I live 40 minutes away from the ‘rents, and while that’s not too bad when the visits are just the weekly dinner at Grammy’s, the time and the miles really start to add up when the visits turn into every other day to make sure Mom’s getting around OK and Dad is eating and sleeping.

I was sorely tempted to hack together some kind of rudimentary telepresence solution, but I couldn’t think of anything that wouldn’t have been either unacceptably intrusive (think webcams) or difficult to support from an IT perspective. Mom’s pretty handy with the iPad and she Skypes with my brother and his family out in California, but beyond leveraging that, I was tapped out for ideas I could deploy easily and that would deliver enough value to justify the support burden in the time it took the ankle to heal. Consequently, I spent a lot of time in the car this summer.

This experience got me to thinking about how intergenerational caregiving will change with the rise of pervasive technology. The bad news: we’re still going to get old, and getting old sucks. The good news is, I think technology is going to make things easier for caregivers and elders alike. We have an incredible range of technology experiences among the generations present right now, from my parents who can remember phones without dials and nights spent listening to the radio, to my daughter’s generation that is practically growing up with supercomputers in the palms of their hands. How each generation ages and how it embraces technology as a solution for age-related problems are going to be vastly different.
