Google recently updated their Google Mobile App with a couple of new features. Voice Search automatically starts listening when you raise the phone to your ear. Just say what you’re looking for, and it will query Google and return the results. The app leverages Google’s voice recognition engine, which they’ve been training with Goog-411. [Andy Baio] has been experimenting with audio transcription and was curious what the new app was doing behind the scenes. He started by sniffing packets as they traversed his network. Unfortunately, the data packets transmitted are so small that he’s almost certain he’s missing something, and he’d appreciate any help with the effort. Part of the problem might be Google getting special treatment and using undocumented iPhone SDK features.
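One reason the captured payloads could legitimately be small: speech codecs compress voice aggressively, so a short spoken query doesn’t need much bandwidth. Here’s a back-of-the-envelope sketch — the query length, sample rate, and codec bitrate below are assumptions for illustration, not values observed in the actual capture:

```python
# Back-of-the-envelope: how small can a spoken search query get?
# Assumed (hypothetical) parameters: a 2-second query, 16 kHz 16-bit mono PCM,
# and a narrowband speech codec running around 8 kbit/s (Speex-class).

QUERY_SECONDS = 2.0
SAMPLE_RATE_HZ = 16_000
BYTES_PER_SAMPLE = 2          # 16-bit PCM
CODEC_BITRATE_BPS = 8_000     # ~8 kbit/s, typical for low-bitrate speech codecs

raw_bytes = int(QUERY_SECONDS * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE)
compressed_bytes = int(QUERY_SECONDS * CODEC_BITRATE_BPS / 8)

print(f"raw PCM:    {raw_bytes} bytes")        # 64000
print(f"compressed: {compressed_bytes} bytes") # 2000
print(f"ratio:      {raw_bytes / compressed_bytes:.0f}x")
```

Under those assumptions a two-second query shrinks from 64 KB of raw PCM to around 2 KB on the wire — small enough that a capture might look like “not enough data” even when the full audio is there.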
4 thoughts on “Reversing Google’s iPhone Voice Search”
wow, great find. the things google could do with that data set for research are… yeah
Post1, not sure it would be any different than their search engine.
@ Mr. Phillips, Well written, but of course he’s missing the transcription in the packets. Doesn’t transcription happen client side in the app?
The app uses the mic to record, then ‘decodes’ the sounds into phonemes, which it submits. Google then returns a binary response that correlates to the final search.
So basically, the guy knows nothing about transcription and is asking (someone else) to do all the work.
So your cell phone can listen to your voice and relay your words over the net in a machine-readable form?
anyone else think this is kind of creepy and bigbrotherish?
It will compete with iPhone and Blackberry in the future. Users will have more choices… good.