Rita’s Dolls Probably Live Better Than You Do

If it weren’t for the weird Dutch-Norwegian techno you’d presumably have to listen to forever, [Gianni B.]’s dollhouse for his daughter, [Rita], makes living in a Barbie World seem like a worthwhile endeavor. True to modern form, it’s got LED lighting. It’s got IoT. It’s got an app and an elevator. It even has a tiny, working television.

It all started with a Christmas wish. [Rita] could no longer bear the thought of her Barbie dolls living a homeless lifestyle on her floor, begging passing toys for enough Monopoly money to buy a sock to sleep under. However, when [Gianni] visited the usual suspects to purchase a dollhouse, he found their offerings disappointing and expensive.

So, going with the traditional collaborating-with-Santa ruse, he and his family had the pleasure of working together on a dollhouse development project. Each room is lit by four ultra-bright LEDs. There is an elevator that’s controlled by an H-bridge module, modified to have electronic braking. [Rita] doesn’t own a Dr. Barbie yet, so safety is paramount.
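For the curious, the braking trick is simple enough to sketch in a few lines. To be clear, this is not [Gianni]’s PIC firmware (we haven’t seen it); it’s a hedged, Arduino-flavored illustration of how a generic two-input H-bridge module can short the motor windings to stop the elevator car quickly instead of letting it coast. The pin numbers and speeds are made up.

    // Hypothetical elevator drive on a generic two-input H-bridge module.
    const int IN1 = 5;   // H-bridge input 1 (PWM-capable pin)
    const int IN2 = 6;   // H-bridge input 2 (PWM-capable pin)

    void elevatorUp(int speed)   { analogWrite(IN1, speed); analogWrite(IN2, 0); }
    void elevatorDown(int speed) { analogWrite(IN1, 0);     analogWrite(IN2, speed); }

    // Electronic brake: on many common driver modules, driving both inputs
    // high (both low on some others) shorts the motor through the bridge,
    // dumping the back-EMF so the cab stops hard rather than coasting.
    void elevatorBrake() { digitalWrite(IN1, HIGH); digitalWrite(IN2, HIGH); }

    void setup() {
      pinMode(IN1, OUTPUT);
      pinMode(IN2, OUTPUT);
      elevatorBrake();
    }

    void loop() {
      elevatorUp(180);    // head for the penthouse at roughly 70% duty
      delay(1500);
      elevatorBrake();    // stop at the floor
      delay(2000);
      elevatorDown(180);
      delay(1500);
      elevatorBrake();
      delay(2000);
    }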

The brain of the home automation is a PIC micro with a Bluetooth module. He wrote some code for it, available here. He also went an extra step and used MIT’s Scratch to make an app interface for the dollhouse. You can see it work in the video after the break. The last little hack was the TV. An old Arduino, an SD card shield, and a tiny 2.4-inch TFT combine to make what’s essentially a tiny digital picture frame.
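Since we haven’t seen the source for the TV either, here’s a hedged sketch of just how little code a build like that needs. The display driver (an Adafruit-style ILI9341 panel), the pin assignments, and the raw 240x320 RGB565 image files on the SD card are all assumptions on our part.

    #include <SPI.h>
    #include <SD.h>
    #include <Adafruit_GFX.h>
    #include <Adafruit_ILI9341.h>

    const int TFT_CS = 10, TFT_DC = 9, SD_CS = 4;   // placeholder wiring
    Adafruit_ILI9341 tft(TFT_CS, TFT_DC);

    // Push a raw 240x320 RGB565 file from the SD card to the panel, one row at a time.
    void showRawImage(const char *name) {
      File f = SD.open(name);
      if (!f) return;
      static uint16_t line[240];
      for (int y = 0; y < 320 && f.available(); y++) {
        f.read((uint8_t *)line, sizeof(line));
        tft.drawRGBBitmap(0, y, line, 240, 1);
      }
      f.close();
    }

    void setup() {
      tft.begin();
      SD.begin(SD_CS);
    }

    void loop() {                      // cycle images like a digital picture frame
      showRawImage("frame0.raw");
      delay(3000);
      showRawImage("frame1.raw");
      delay(3000);
    }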

His daughters are overjoyed with the elevation of their dolls’ economic class, and a proud father even got to show it off at a Maker Faire. Very nice!

Continue reading “Rita’s Dolls Probably Live Better Than You Do”

Interactive Dynamic Video

If a picture is worth a thousand words, a video must be worth millions. However, computers still aren’t very good at analyzing video. Machine vision software like OpenCV can do certain tasks like facial recognition quite well. But current software isn’t good at determining the physical nature of the objects being filmed. [Abe Davis, Justin G. Chen, and Fredo Durand] are members of the MIT Computer Science and Artificial Intelligence Laboratory. They’re working toward a method of determining the structure of an object based upon the object’s motion in a video.

The technique relies on vibrations which can be captured by a typical 30 or 60 frames-per-second (fps) camera. Here’s how it works: a locked-down camera is used to image an object. The object is moved by wind, someone banging on it, or any other mechanical means. This movement is captured on video. The team’s software then analyzes the video to see exactly where the object moved, and how much it moved. Complex objects can have many vibration modes. The wire-frame figure used in the video is a great example. The hands of the figure will vibrate more than the figure’s feet. The software uses this information to construct a rudimentary model of the object being filmed. It then allows the user to interact with the object by clicking and dragging with a mouse. Dragging the hands will produce more movement than dragging the feet.
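If you want a feel for the raw signal this all starts from, the first step can be approximated with off-the-shelf tools. The sketch below is emphatically not the team’s pipeline (their modal analysis is far more sophisticated); it just uses OpenCV’s dense optical flow to measure how much each pixel of a locked-down shot moves from frame to frame, which is the kind of vibration data the method builds on.

    #include <opencv2/opencv.hpp>
    #include <iostream>
    #include <vector>

    int main(int argc, char **argv) {
      if (argc < 2) { std::cerr << "usage: flow <video>\n"; return 1; }
      cv::VideoCapture cap(argv[1]);          // footage from a tripod-mounted camera
      cv::Mat frame, gray, prev, flow;
      cap >> frame;
      if (frame.empty()) return 1;
      cv::cvtColor(frame, prev, cv::COLOR_BGR2GRAY);

      while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        // Dense flow: a two-channel float image of per-pixel (dx, dy) motion.
        cv::calcOpticalFlowFarneback(prev, gray, flow, 0.5, 3, 15, 3, 5, 1.2, 0);

        // Motion magnitude: large where the object vibrates a lot (the wire
        // figure's hands), small where it is stiff (its feet).
        std::vector<cv::Mat> xy(2);
        cv::split(flow, xy);
        cv::Mat mag;
        cv::magnitude(xy[0], xy[1], mag);
        std::cout << "mean motion this frame: " << cv::mean(mag)[0] << " px\n";

        prev = gray.clone();
      }
      return 0;
    }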

The results aren’t perfect – they remind us of computer animated objects from just a few years ago. However, this is very promising. These aren’t textured wire frames created in 3D modeling software. The models and skeletons were created automatically using software analysis. The team’s research paper (PDF link) contains all the details of their research. Check it out, and check out the video after the break.

Continue reading “Interactive Dynamic Video”

Skin Bling: Wearable Electronics From Golden Temporary Tattoos

MIT Media Lab and Microsoft have teamed up to take wearable devices one step further — they’ve glued the devices directly to the user’s skin. DuoSkin is a temporary tattoo created with gold leaf. Metallic “Flash” temporary fashion tattoos have become quite popular recently, so this builds on the trend. What the team has done is to use them to create user interfaces for wearable electronic devices.

Generally speaking, gold leaf is incredibly fragile. In this process, to yield the cleanest-looking leaf, the gold is not actually cut. Instead, the temporary tattoo film and backer are cut on a standard desktop vinyl cutter. The gold leaf is then applied to the entire film surface. The cut film/leaf can then be “weeded” — removing the unwanted portions of film which were isolated from the rest by the cutting process — to complete the temporary tattoo. The team tested this method and found that traces 4.5 mm or wider were resilient enough to last the entire day on your skin.

The gold leaf tattoos make excellent capacitive touch sensors. The team was able to create sliders, buttons, and even two-dimensional diamond grids. These controls were used to move a cursor on a computer or phone screen. They were even able to create a wearable NFC tag. The gold leaf is the antenna, and the NFC chip itself is mounted on the temporary tattoo backer.
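If you want to play with the sensing side before breaking out the gold leaf, the basic capacitive button is easy to prototype. This is not the DuoSkin readout hardware; it’s a hedged sketch using the common Arduino CapacitiveSensor library, with placeholder pins and a touch threshold you’d tune for your own trace.

    #include <CapacitiveSensor.h>

    // Send pin 4 charges the pad through a ~1 megohm resistor; receive pin 2
    // connects to the conductive trace (the gold-leaf pad) itself.
    CapacitiveSensor pad(4, 2);
    const long TOUCH_THRESHOLD = 500;   // tune for trace size and skin contact

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      long reading = pad.capacitiveSensor(30);   // 30 samples, returns a raw charge time
      if (reading > TOUCH_THRESHOLD) {
        Serial.println("pad touched");           // a wearable would report this over Bluetooth
      }
      delay(50);
    }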

These devices all look great, but with the exception of the NFC chip, we’re not seeing the electronics driving them. Capacitive touch sensors used as a UI for a phone will have to have a Bluetooth radio and a battery somewhere. We’re guessing that’s all hidden under the arm of the user. You can see what we’re talking about in the video after the break. That said, the tools and materials are ubiquitous and easy to work with. Take a quick read through the white paper (PDF) and you can be making your own version of this today.

Continue reading “Skin Bling: Wearable Electronics From Golden Temporary Tattoos”

SensorTape Unrolls New Sensor Deployment Possibilities

An embedded MEMS sensor might be lots of fun to play with on your first foray into the embedded world–why not deploy a whole network of them? Alas, communicating with a series of identical sensors becomes increasingly complicated as we start needing to handle the details of signal integrity and the communication protocols to handle all that data. Fortunately, [Artem], [Hsin-Liu], and [Joseph] at the MIT Media Lab have made sensor deployment as easy as unrolling a strip of tape from your toolkit. They’ve developed SensorTape, an unrollable, deployable network of interconnected IMU and proximity sensors packaged in the familiar form factor of a roll of masking tape.

Possibly the most interesting technical challenge in a string of connected sensor nodes is picking a protocol that will deliver appreciable data rates with low latency. For that task the folks at the MIT Media Lab picked a combination of I²C and peer-to-peer serial. I²C accommodates the majority of transmissions from master to tape-node slave, while addresses are assigned dynamically over serial via inter-microcontroller communication. The net effect is speedy transfers at the standard 100 kHz I²C clock, with an initialization sequence that handles chains of various lengths–up to 128 units long! The full details behind the protocol are in their paper [PDF].
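We haven’t dug through the SensorTape firmware itself, but the addressing scheme is easy to sketch in Arduino-flavored C++: each segment learns its address over serial from its upstream neighbor, hands address-plus-one downstream, and then joins the shared I²C bus as a slave at that address. (On the real tape each node would have separate links to its two neighbors; a single UART stands in for both here to keep the sketch short.)

    #include <Wire.h>

    uint8_t myAddress = 0;

    void onI2cRequest() {
      // The master polls each node in turn; reply with a dummy sensor reading.
      Wire.write((uint8_t)(analogRead(A0) >> 2));
    }

    void setup() {
      Serial.begin(115200);                    // link to the upstream neighbor (or the master)
      while (Serial.available() == 0) { }      // wait for our address to arrive
      myAddress = Serial.read();

      Serial.write((uint8_t)(myAddress + 1));  // tell the next segment down the tape who it is

      Wire.begin(myAddress);                   // join the shared bus as slave "myAddress"
      Wire.onRequest(onI2cRequest);
    }

    void loop() {
      // Sensor sampling would go here; I2C replies happen in the request handler.
    }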

With a system as reconfigurable as SensorTape, new possibilities unfold with a solid framework for deploying sensors and aggregating the data. Have a look at their video after the break to get a sense of some of the use cases they’ve uncovered. Beyond their discoveries, there are certainly plenty of others. What happens when we spin them up in the dryer, lay them under our car, or stick them on the ceiling? These are questions we may never have dreamed up because the tools just didn’t exist! Our props go out to SensorTape for giving us a tool to explore a world of sensor arrays without having to trip over ourselves in the implementation details.

via [CreativeApplications]

Continue reading “SensorTape Unrolls New Sensor Deployment Possibilities”

Super Thin ICs Are Coming

An ordinary integrated circuit is made of layers. Typically each layer is a single material (like silicon dioxide, polysilicon, copper, or aluminum). Sometimes a process step will modify parts of a layer (for example, using ion implantation to dope regions of silicon). Other times, part of the layer will be cut away using a photolithography process.

Researchers at MIT have a new technique that allows super-thin layers (1-3 atoms thick) and, even more importantly, enables you to use two materials in the same layer. They report that they have built all the basic components required to create a computer using the technique.

Continue reading “Super Thin ICs Are Coming”

Augmented Reality Becomes Useful, Real

The state of augmented reality is terrible. Despite everyone having handheld, portable computers with high-resolution cameras, no one has yet built ‘Minecraft with digital blocks in real life’, and the most exciting upcoming use for augmented reality is 3D Dungeons and Dragons. There are plenty of interesting things that can be done with augmented reality; the problem is that someone needs to figure out what those things are. Lucky for us, the MIT Media Lab knocked it out of the park with the ability to program anything through augmented reality.

The Reality Editor is a simple idea, but one that is extraordinarily interesting. Objects all around you are marked with a design that can be easily read by a smartphone running a computer vision application. In augmented reality, these objects have buttons and dials that can be used to turn on a lamp, open a car’s window, or any other function that can be controlled over the Internet. It’s augmented reality buttons for everything.
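Whatever the Reality Editor’s objects actually run server-side, the underlying idea is just a device exposing its functions over the network so a virtual button can be bound to them. As a hedged stand-in (board choice, pins, and Wi-Fi credentials are all placeholders, not anything from the MIT project), here’s a lamp on an ESP8266 that switches over plain HTTP.

    #include <ESP8266WiFi.h>
    #include <ESP8266WebServer.h>

    const int LAMP_PIN = 2;
    ESP8266WebServer server(80);

    void setup() {
      pinMode(LAMP_PIN, OUTPUT);
      WiFi.begin("my-ssid", "my-password");              // placeholder credentials
      while (WiFi.status() != WL_CONNECTED) delay(100);

      // Each function the lamp offers becomes a URL an AR button can point at.
      server.on("/lamp/on",  []() { digitalWrite(LAMP_PIN, HIGH); server.send(200, "text/plain", "on");  });
      server.on("/lamp/off", []() { digitalWrite(LAMP_PIN, LOW);  server.send(200, "text/plain", "off"); });
      server.begin();
    }

    void loop() {
      server.handleClient();   // the editor's virtual button simply hits these URLs
    }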

This basic idea is simple, but by combining it with another oft-forgotten technology from the ’90s, we get something really, really cool. The buttons on each of the objects can be connected together with a sort of graphical programming language. Scan a button, connect the button to a lamp, and you’re able to program the lamp with augmented reality.

The Reality Editor is already available on the Apple App Store, and there are a number of examples available for people to start tinkering with this weird yet interesting means of interacting with the world. If you’ve ever wondered how we’re going to interact with the Internet of Things, there you have it. Video below.

Continue reading “Augmented Reality Becomes Useful, Real”

Using RF To See Through Walls

This is some seriously cool stuff. Researchers at MIT recently came up with a device that can “see” through walls. It can actually identify a person (or people) behind a solid object.

They call it RF-Capture, and it uses radio waves to identify people, kind of like some high-tech radio-frequency sonar. Using a very complex algorithm, it can reconstruct the human figure by analyzing the various reflections of the transmitted signals. It’s so accurate it can distinguish between different people based on size and posture, and even trace a person’s handwriting in the air.

Sounds like whatever they’re doing probably requires blasting out a lot of radiation, right? You’d think so, but no.

Continue reading “Using RF To See Through Walls”