Is That Google In Your Pants?

Google’s Project Jacquard is tackling the age-old gap between controlling your electronic device and touching yourself. They are doing this by weaving conductive thread into clothing in the form of a touch pad. In partnership with Levi Strauss & Co., Google has been designing and producing touch interfaces that are meant to be used by developers however they see fit.

The approach that Project Jacquard has taken from a hardware standpoint is on point. Rather than having an end-user product in mind and designing completely toward that goal, the project is focused on the interface as its product. This has the added benefit of endless varieties of textile interface possibilities. As stated in the video embedded after the break, the conductive touch interface can be designed as a visibly noticeable difference in material or seamlessly woven into a garment.
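
If you want a feel for how a conductive-thread pad could be read, here’s a minimal Arduino-style sketch using single-pin charge-time sensing. The pin number and threshold are our own guesses for illustration; this isn’t Project Jacquard’s actual hardware or firmware.

```cpp
// Minimal sketch (assumptions, not Project Jacquard's actual design):
// a single conductive-thread pad wired to SENSE_PIN, read by timing how
// long the pin takes to charge through the fabric's capacitance.
const int SENSE_PIN = 2;          // hypothetical pin tied to the woven pad
const long TOUCH_THRESHOLD = 40;  // tune for your thread and garment

long chargeTime() {
  pinMode(SENSE_PIN, OUTPUT);        // discharge the pad
  digitalWrite(SENSE_PIN, LOW);
  delay(1);
  pinMode(SENSE_PIN, INPUT_PULLUP);  // let the internal pull-up charge it back up
  long t = 0;
  while (digitalRead(SENSE_PIN) == LOW && t < 10000) {
    t++;                             // a finger adds capacitance, slowing the rise
  }
  return t;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  long t = chargeTime();
  Serial.println(t > TOUCH_THRESHOLD ? "touched" : "idle");
  delay(50);
}
```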

As awesome as this new interface may seem there are some things to consider:

  • Can an unintentional brush with another person “sleeve dial” your boss or mother-in-law?
  • What are the implications of Google putting sensors in your jeans?
  • At what point is haptic feedback inappropriate? And do we have to pay extra for that?

We’ve covered e-textiles before, from a conductive thread and through-hole components approach to electro-mechanical implementations.

Continue reading “Is That Google In Your Pants?”

Impedance Tomography Is The New X-Ray Machine

Seeing what’s going on inside a human body is pretty difficult. Unless you’re Superman and you have X-ray vision, you’ll need a large, expensive piece of medical equipment. And even then, X-rays are a harmful part of the electromagnetic spectrum. Rather than using a large machine or questionable Kryptonian ionizing radiation vision, there’s another option now: electrical impedance tomography.

[Chris Harrison] and the rest of a research team at Carnegie Mellon University have come up with a way to use electrical excitation to view internal impedance cross-sections of an arm. While this doesn’t have the resolution of an X-ray or CT scan, there’s still a large amount of information that can be gathered from this method. Different structures in the body, like bone, will have a different impedance than muscle or other tissues. Even a flexed muscle changes its impedance from its resting state, and the team has used their sensor as a proof of concept for hand gesture recognition.
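
To make the gesture-recognition idea concrete, here’s a toy sketch of one way it could work: collect an impedance reading from each electrode pair and match the vector against previously recorded templates, nearest-centroid style. The electrode count, values, and labels are invented for illustration and aren’t the CMU team’s actual pipeline.

```cpp
// Sketch of the idea only (not the CMU team's actual pipeline): classify a
// hand gesture by comparing a vector of electrode-pair impedance readings
// against previously recorded templates, nearest-centroid style.
#include <array>
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

constexpr int kPairs = 8;                  // assumed number of electrode pairs
using Frame = std::array<double, kPairs>;  // one impedance reading per pair (ohms)

struct Template { std::string gesture; Frame centroid; };

double distance(const Frame& a, const Frame& b) {
  double d = 0;
  for (int i = 0; i < kPairs; ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
  return std::sqrt(d);
}

std::string classify(const Frame& f, const std::vector<Template>& templates) {
  std::string best = "unknown";
  double bestDist = 1e18;
  for (const auto& t : templates) {
    double d = distance(f, t.centroid);
    if (d < bestDist) { bestDist = d; best = t.gesture; }
  }
  return best;
}

int main() {
  // Made-up training centroids: a relaxed hand vs. a clenched fist.
  std::vector<Template> templates = {
      {"relaxed", {{410, 395, 402, 388, 420, 415, 399, 405}}},
      {"fist",    {{360, 350, 348, 340, 372, 365, 355, 358}}},
  };
  Frame live = {362, 355, 350, 342, 370, 368, 352, 360};  // stand-in for a live reading
  std::printf("gesture: %s\n", classify(live, templates).c_str());
}
```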

This device is small, low power, and low-cost, and we could easily see it being the “next thing” in smart watch features. Gesture recognition at this level would open up a whole world of possibilities, especially if you don’t have to rely on any non-wearable hardware like ultrasound or LIDAR.

The Simplest Smart Glasses Concept

Google Glass kind of came and went, leaving one significant addition to the English language. Even Google itself used the term “glasshole” for people who used the product in a creepy way. We can’t decide if wearing an obviously homemade set of glasses like the ones made by [Jordan Fung] is more creepy, gives you more hacker cred, or just makes you look like a Borg. Maybe some combination of all of those. While the cost and complexity of developing for Google Glass was certainly a barrier for hacking on that hardware, this project is just begging for you to build your own and run with the concept.

[Jordan’s] build, called Pedosa Glass, really is pretty respectable for a self-built setup. The Arduino Nano is a bit bulky, and the three push buttons take up some room, but it doesn’t kill the ability to mount them in a glasses form factor. An FLCoS display lets you see the output of the software, which [Jordan] is still developing. Right now the features include a timer and a flashlight that uses the head-mounted white LED. Not much, we admit, but enough to prove out the hardware, and the whole point is to add the software you want.
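
For a sense of how little firmware those two features need, here’s a rough Arduino sketch of the same idea: one button toggles the LED, another starts a 60-second countdown. Pin assignments and behavior are our assumptions, not [Jordan]’s actual code.

```cpp
// Rough sketch, not [Jordan]'s actual firmware: one button toggles the
// head-mounted white LED ("flashlight"), another starts a 60-second timer.
// Pin assignments are assumptions.
const int LED_PIN = 5;
const int FLASHLIGHT_BTN = 2;   // buttons wired to ground, using internal pull-ups
const int TIMER_BTN = 3;

bool flashlightOn = false;
unsigned long timerEnd = 0;     // 0 means no timer running

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(FLASHLIGHT_BTN, INPUT_PULLUP);
  pinMode(TIMER_BTN, INPUT_PULLUP);
  Serial.begin(9600);           // stand-in for drawing to the FLCoS display
}

void loop() {
  if (digitalRead(FLASHLIGHT_BTN) == LOW) {      // pressed
    flashlightOn = !flashlightOn;
    digitalWrite(LED_PIN, flashlightOn ? HIGH : LOW);
    delay(250);                                  // crude debounce
  }
  if (digitalRead(TIMER_BTN) == LOW) {
    timerEnd = millis() + 60000UL;               // 60-second countdown
    delay(250);
  }
  if (timerEnd != 0) {
    long remaining = (long)(timerEnd - millis());
    if (remaining <= 0) {
      Serial.println("Timer done!");
      timerEnd = 0;
    } else {
      Serial.println(remaining / 1000);          // seconds left, to the "display"
    }
  }
  delay(100);
}
```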

Admittedly, it isn’t exactly like Google Glass. Although both use FLCoS displays, Pedosa Glass uses a display meant for a camera viewfinder, so you don’t really see through it. Still, there might be some practical use for a little display mounted in your field of vision. The system would improve with a better CPU that is easier to connect to the network, and with sensors like an accelerometer; there’s plenty of room to iterate on this project. Then again, you do have an entire second ear piece to work with if you wanted to expand the system.

Check out the video demo after the break.

Continue reading “The Simplest Smart Glasses Concept”

Disney’s Designing A Smart Watch That Knows What You’re Touching

Did you know Disney actually has a huge R&D subsidiary? It’s called Walt Disney Imagineering, and they’ve come up with some pretty interesting technology. They’re currently working on a smart watch interface called EM-Sense that uses an electromagnetic signal to detect and learn what the user is interacting with.

Basic machine learning allows the watch to learn what different devices “feel” like electromagnetically. It’s capable of detecting things you would expect, like appliances, power tools, and even electronic devices. It’s apparently sophisticated enough to tell when you’re touching a door handle (and which one), based on the structure and EM feedback!
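
Here’s a toy sketch of the general idea, definitely not Disney’s actual model: boil a buffer of body-coupled EM samples down to a couple of simple features and label it by nearest neighbor against previously recorded objects. The features and numbers are invented for illustration.

```cpp
// Illustration only, not Disney's EM-Sense model: reduce a buffer of
// body-coupled EM samples to two simple features (RMS level and
// zero-crossing rate), then label it by nearest neighbor against
// previously recorded objects. Feature choice and values are invented.
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

struct Features { double rms; double zeroCrossRate; };

Features extract(const std::vector<double>& samples) {
  double sumSq = 0;
  int crossings = 0;
  for (size_t i = 0; i < samples.size(); ++i) {
    sumSq += samples[i] * samples[i];
    if (i > 0 && (samples[i - 1] < 0) != (samples[i] < 0)) crossings++;
  }
  return {std::sqrt(sumSq / samples.size()),
          (double)crossings / samples.size()};
}

struct Labeled { std::string object; Features f; };

std::string nearest(const Features& q, const std::vector<Labeled>& known) {
  std::string best = "unknown";
  double bestDist = 1e18;
  for (const auto& k : known) {
    double d = std::hypot(q.rms - k.f.rms,
                          10.0 * (q.zeroCrossRate - k.f.zeroCrossRate));
    if (d < bestDist) { bestDist = d; best = k.object; }
  }
  return best;
}

int main() {
  // Made-up signatures: a drill has a strong, fast-switching EM signature;
  // a door handle barely couples anything.
  std::vector<Labeled> known = {
      {"power drill", {0.80, 0.45}},
      {"door handle", {0.05, 0.02}},
  };
  std::vector<double> buffer(256);
  for (size_t i = 0; i < buffer.size(); ++i)
    buffer[i] = 0.8 * std::sin(i * 2.8);          // stand-in for a real capture
  std::printf("touching: %s\n", nearest(extract(buffer), known).c_str());
}
```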

They explain the technology better in the following video, and demonstrate a use case where the smart watch can lead you through activities while giving you tutorials on skills you may need. Sounds like the beginning of a real-life Pip-Boy!

Continue reading “Disney’s Designing A Smart Watch That Knows What You’re Touching”

Development Tools Of The Prop-Making World

We’ve seen them before. The pixel-perfect Portal 2 replica, the Iron Man Arc Reactor, the Jedi Lightsaber. With the rise of shared knowledge via the internet, we can finally take a peek into a world hidden behind garage doors, basements, and commandeered coffee tables strewn with nuts, bolts, and other scraps. That world is prop-making. As fab equipment like 3D printers and laser cutters starts to spill into the hands of more people, fellow DIY enthusiasts have developed effective workflows and corresponding software tools to lighten their loads. I figured I’d take a brief look at a few software tools that can open the possibilities for folks at home to don the respirator and goggles and start churning out props.

Continue reading “Development Tools Of The Prop-Making World”

Knappa Tutu: Some Dancing Required

Sometimes, you see a lamp shade and you’re just intoxicated enough to put it on your head like a hat and dance around on the table. Other times, you see the same lamp shade, and decide to wire it up with NeoPixels, an accelerometer, and an Arduino and make a flowery, motion-activated light show when you wear it as a dress. Or at least that’s what we’ve heard.

[Cheng] gets full marks for the neo-IKEA name for the project and bonus points for clean execution and some nice animations to boot. The build is straightforward: build up the lamp so that it fits around your waist, zip-tie in the RGB LED strip, and connect up the accelerometer and microcontroller. A tiny bit of coding later, and you’re off to the disco. It looks like a ridiculous amount of fun, and a sweet weekend build.
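
If you wanted to roll your own, the firmware really could be about this simple. The sketch below assumes an analog accelerometer axis on A0 and the Adafruit_NeoPixel library, and it’s only a guess at the general shape of the build, not [Cheng]’s actual code: a jolt above a threshold triggers a quick color sweep around the skirt.

```cpp
// A guess at the general shape of the build, not [Cheng]'s actual code:
// when the accelerometer sees a jolt above a threshold, run a quick color
// sweep around the skirt. Assumes an analog accelerometer axis on A0.
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;
const int NUM_PIXELS = 60;
const int ACCEL_PIN = A0;
const int JOLT_THRESHOLD = 80;    // change in raw ADC counts; tune by dancing

Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);
int lastReading = 0;

void setup() {
  strip.begin();
  strip.show();                   // all LEDs off to start
  lastReading = analogRead(ACCEL_PIN);
}

void loop() {
  int reading = analogRead(ACCEL_PIN);
  if (abs(reading - lastReading) > JOLT_THRESHOLD) {
    for (int i = 0; i < NUM_PIXELS; ++i) {       // one sweep of warm pink
      strip.setPixelColor(i, strip.Color(255, 40, 120));
      strip.show();
      delay(10);
    }
    strip.clear();
    strip.show();
  }
  lastReading = reading;
  delay(20);
}
```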

Continue reading “Knappa Tutu: Some Dancing Required”

Hacking Diabetes Meters, Towards An Artificial Pancreas

We’ve covered a number of diabetes-related hacks in the past, but this project sets its goals especially high. [Tim] has diabetes and needs to monitor his blood glucose levels and administer insulin accordingly. As a first step, he and a community of other diabetics have been working on Android apps that, combined with a self-made Bluetooth re-transmitter, log the data.

But [Tim] is taking his project farther than previous projects we’ve seen and aiming at eventually driving an insulin pump directly from the app. (Although he’s not there yet, and user input is still required.) To that end, he’s looking into the protocols that control the dosage pumps.

We just read about [Tim] in this article in the Guardian which covers the diabetic-hacker movement from a medical perspective — the author currently runs a healthcare innovation institute and is a former British health minister, so he’s not a noob. One passage made us pause a little bit. [Tim] speaks the usual praises of tech democratization through open source and laments “If you try to commercialize [your products], you run up against all sorts of regulatory barriers.” To which the author responds, “This should ring alarm bells. Regulatory barriers are there for a reason.”

We love health hacking, and we’re sure that if we had a medical condition that could be helped by constant monitoring, we’d absolutely want at least local smart-phone logging of the relevant data. But how far is too far? We just ran an article on the Therac-25 case study in which subtle software race conditions ended up directly killing people. We’d maybe hesitate a bit before we automated the insulin pump, but perhaps we’re just chicken.

The solution suggested by [Lord Ara Darzi] in the Guardian piece is to form collaborations between patients motivated by the DIY spirit, and the engineers (software and hardware) who would bring their expertise, and presumably a modicum of additional safety margin, to the table. We like that a lot. Why don’t we see more of that?