Remember the days when the future was console cowboys running around cyberspace trying to fry each other’s brains out? MIT Media Lab remembers too. They have a class called MAS S65: Science Fiction to Science Fabrication, in which students create hardware inspired by technology imagined in the works of legendary speculative fiction writers such as William Gibson, Neal Stephenson, and many others. They happened to be at SXSW this year showing off some of the projects their students have been working on. Since we were around, we thought we should pay them a little visit. Fifteen minutes later it was clear why working at the Media Lab is a dream for so many hackers/makers out there.
[Jon Ferguson] from the Media Lab showed us a prototype of a game called Case and Molly, inspired by the scenes in Neuromancer in which Case helps Molly navigate by observing the world through the vision-enhancing lenses sealed in her eye sockets. OK, they haven’t really built surgically attached, internet-connected lenses (yet… we’re certain [Ben Krasnow] is working on it), but they have built a very cool snap-on 3D vision mechanism that attaches to the built-in iPhone camera. Add a little bit of live video streaming, a person with an Oculus Rift and a game controller, and you can party like it’s 1984.
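If you feel like hacking together the receiving end at home, it doesn’t take much. Here is a minimal sketch, assuming the phone pushes an MJPEG stream over WiFi and the snap-on optics produce a side-by-side stereo image; the stream URL and frame layout are our guesses, not details from the actual project.

```python
# Hedged sketch: grab a phone camera stream and split a side-by-side
# stereo frame into left/right eye views. The URL and the side-by-side
# layout are assumptions for illustration only.
import cv2

STREAM_URL = "http://192.168.1.42:8080/video"  # hypothetical phone MJPEG stream

cap = cv2.VideoCapture(STREAM_URL)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    left, right = frame[:, : w // 2], frame[:, w // 2 :]  # split the stereo halves
    cv2.imshow("left eye", left)
    cv2.imshow("right eye", right)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Pushing the two halves into a Rift properly would take a real stereo renderer, but this is enough to see what the other player sees.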
Another interesting project is called “Mandala : I am building E14” and it uses data collected from a sensor network in MIT E14 in order to provide a view of the universe from the standpoint of a single building. It tries to address the old “what if buildings could talk?” question by visualizing the paths of people walking around the building and providing an overall sense of activity in different areas. It is also a pretty good demonstration of all the creepy things that are yet to be built using all the ‘connected devices’ coming our way.
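For the curious, here is a minimal sketch of the kind of “building’s eye view” visualization we’re talking about, assuming a hypothetical CSV log of (x, y) sensor hits; the real E14 sensor network and its data format are certainly more involved.

```python
# Hedged sketch: turn a log of (x, y) sensor hits from around a building
# into an activity heatmap. The CSV format and grid size are made up.
import csv
import numpy as np
import matplotlib.pyplot as plt

GRID = (40, 60)  # hypothetical floor-plan resolution (rows, cols)
activity = np.zeros(GRID)

with open("sensor_hits.csv") as f:          # hypothetical log: one "x,y" per row
    for x, y in csv.reader(f):
        activity[int(y) % GRID[0], int(x) % GRID[1]] += 1

plt.imshow(activity, cmap="hot", interpolation="nearest")
plt.title("Where the building 'sees' people moving")
plt.colorbar(label="hits")
plt.show()
```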
It gets better. The Sensory Fiction project pairs a special book with a vest that enhances the reading experience by providing stimulation meant to make the reader feel the same physiological responses as the characters in the book. The wearable supports a whole bunch of outputs: light, sound, temperature, pressure and vibration, and it can even influence your heart rate. It is very easy to imagine the many potential ‘creative’ abuses of such a device.
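As a rough, hedged idea of how such a vest could be driven: the sketch below assumes a hypothetical microcontroller listening on serial for one byte per actuator channel, and a made-up mapping from emotion tags to output levels. This is not the project’s actual protocol.

```python
# Hedged sketch: map an emotion tag per page to vest actuator levels and
# push them to a microcontroller over serial. The tag-to-output mapping,
# port name, and one-byte-per-channel protocol are invented.
import serial  # pyserial

# hypothetical mapping: (light, heat, vibration, pressure), each 0-255
EMOTIONS = {
    "calm":   (40, 60, 0, 30),
    "tense":  (200, 120, 180, 220),
    "elated": (255, 90, 120, 60),
}

def set_vest(link, emotion):
    """Send one byte per actuator channel for the current page's mood."""
    light, heat, vib, press = EMOTIONS[emotion]
    link.write(bytes([light, heat, vib, press]))

vest = serial.Serial("/dev/ttyUSB0", 115200)   # hypothetical serial port
set_vest(vest, "tense")                        # reader just hit a chase scene
```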
Another Neuromancer-inspired piece, called LIMBO (Limbs In Motion By Others), allows synchronization of hand gestures between multiple ‘users’ over a network using a special electric muscle stimulation rig. The result is a sort of ‘meat puppet’: one person’s hand is forced to match the movements of the other’s. Devious ideas aside, it has great potential for helping paraplegics control their muscle movements using eye tracking.
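For a flavor of how the networked part of a ‘meat puppet’ could work, here is a hedged sketch: a sender publishes gesture IDs over UDP and a receiver forwards them to an EMS driver board over serial. The gesture set, port number, and serial commands are all our own inventions for illustration, not LIMBO’s actual design.

```python
# Hedged sketch: sync gestures over a network. The sender broadcasts a
# gesture ID over UDP; the receiver forwards it to a hypothetical EMS
# driver board over serial.
import socket
import serial  # pyserial

PORT = 9000                                              # hypothetical UDP port
GESTURES = {0: b"RELAX\n", 1: b"FIST\n", 2: b"POINT\n"}  # hypothetical EMS commands

def send_gesture(host, gesture_id):
    """Run on the 'puppeteer' side: publish the current hand gesture."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(bytes([gesture_id]), (host, PORT))

def receive_loop(ems_port="/dev/ttyACM0"):
    """Run on the 'meat puppet' side: drive the EMS rig from incoming IDs."""
    ems = serial.Serial(ems_port, 115200)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    while True:
        data, _ = sock.recvfrom(1)
        ems.write(GESTURES.get(data[0], b"RELAX\n"))
```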
Finally, a more cheerful project called BubbleSynth demonstrates an open computer vision/sound synthesis platform that uses physical processes as input to granular synthesis. The current installation pairs a bubble-generating machine with motion tracking, which triggers a modular synthesizer and produces beautiful ambient sounds. The audio part of the platform is based on SuperCollider and is completely customizable. The next iteration of the project will use the movement of a species of bacteria to generate the music. Why struggle with learning how to play an instrument? We’ll get bacteria to do all the work.
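If you want to play with the same idea, here is a minimal sketch of the computer vision half, assuming OpenCV for blob tracking and python-osc to ping a SuperCollider synth listening for a hypothetical /grain message; the real BubbleSynth patch will differ.

```python
# Hedged sketch: track moving blobs (bubbles) with OpenCV and fire OSC
# messages at SuperCollider, mapping position to pitch and blob size to
# grain amplitude. The /grain address and argument order are made up.
import cv2
from pythonosc.udp_client import SimpleUDPClient

sc = SimpleUDPClient("127.0.0.1", 57120)       # default SuperCollider language port
cap = cv2.VideoCapture(0)
bg = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    h, w = mask.shape
    for c in contours:
        area = cv2.contourArea(c)
        if area < 50:                          # ignore noise
            continue
        _, y, _, bh = cv2.boundingRect(c)
        pitch = 200 + 2000 * (1 - (y + bh / 2) / h)   # higher bubble -> higher pitch
        amp = min(area / 5000.0, 1.0)                 # bigger bubble -> louder grain
        sc.send_message("/grain", [pitch, amp])
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```

On the SuperCollider side, an OSCdef listening on /grain could spawn grains from the incoming pitch and amplitude, which keeps the audio part fully customizable just like the original platform.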
Feel like building something similar? Hackaday’s current Sci-Fi contest is a perfect excuse. Need inspiration? Check out the syllabus for the MIT SciFi2SciFab class!