[Will Powell] sent in his real-time subtitle glasses project. Inspired by the ever-cool Google Project Glass, he decided to experiment with his own version.
He used two Raspberry Pis running Debian Squeeze, Vuzix glasses, microphones, a TV, an iPad, and an iPhone as the hardware components. The flow of data is kind of strange in this project. The audio first gets picked up by a Bluetooth microphone and streamed through a smart device to a server on the network. Once it's on the server, it gets run through Microsoft's translation API. After that, the translated message is sent back to a Raspberry Pi, where it's displayed as subtitles on the glasses.
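The details of [Will]'s server code aren't public, but a minimal sketch of that middle hop might look something like the Python below. It assumes the modern Microsoft Translator REST API (v3.0) rather than whatever version was available at the time, and the Pi address, port, and subscription key are placeholders for illustration only.

```python
import socket
import requests

# Placeholder values -- the actual endpoint, key, and Pi address from
# [Will]'s build are not public; these are assumptions for illustration.
TRANSLATOR_ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"
SUBSCRIPTION_KEY = "YOUR-TRANSLATOR-KEY"
PI_ADDRESS = ("192.168.1.50", 5005)  # hypothetical Raspberry Pi host/port


def translate(text, target_lang="en"):
    """Send recognized speech to the Microsoft Translator REST API (v3.0)."""
    response = requests.post(
        TRANSLATOR_ENDPOINT,
        params={"api-version": "3.0", "to": target_lang},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json=[{"Text": text}],
        timeout=5,
    )
    response.raise_for_status()
    return response.json()[0]["translations"][0]["text"]


def push_subtitle(line):
    """Forward the translated line to the Pi over a plain TCP socket
    (the transport in the real build is an assumption here)."""
    with socket.create_connection(PI_ADDRESS, timeout=2) as sock:
        sock.sendall((line + "\n").encode("utf-8"))


if __name__ == "__main__":
    # Pretend this string arrived from the phone's speech-recognition step.
    recognized = "Bonjour, comment allez-vous?"
    push_subtitle(translate(recognized))
```

On the glasses side, the Pi would just need to listen on that port and draw each incoming line as subtitle text on the Vuzix display.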
Of course, this is far from the universal translator seen in Star Trek. The person being translated has to speak clearly into a microphone, and there are several layers of hardware and software in between. Still, as far as tech demos go, it is pretty cool, and you can see him playing a game of chess using the system after the break.