[Gus] made it to the Google+ developers vlog to show off his new Google+ hangout controlled robot. This robot, named OSCAR (Overly Simplified Collaboratively Actuated Robot), drives around according to the whims of everyone in a Google+ hangout. Not only is the robot under remote control through a Google+ hangout, it also features a camera, allowing a hangout audience to explore a space in real time.
[Gus] built OSCAR out of an old Roomba he found in his parents' basement. After attaching an Android tablet to the Roomba with some binder clips, [Gus] put a web server on the tablet and wrote a Google+ hangout extension allowing all hangout viewers to remotely control OSCAR.
Right now, all the commands received on the hangout are put into a queue, meaning everyone on a hangout has control of OSCAR. The next version will change those commands to deltas, or changes in the current state, canceling out conflicting commands. If only we had one of these while we were streaming for the Red Bull competition…
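The delta idea is easy to picture: rather than executing every queued command in turn, opposing votes from the audience cancel out and the robot acts on the net result. Here's a hypothetical sketch of that merging step (none of these names come from [Gus]'s actual extension):

```javascript
// Merge a batch of viewer commands into a single net "delta".
// Conflicting commands (forward vs. back, left vs. right) cancel out.
function mergeCommands(commands) {
  const delta = { forward: 0, turn: 0 };
  for (const cmd of commands) {
    switch (cmd) {
      case "forward": delta.forward += 1; break;
      case "back":    delta.forward -= 1; break;
      case "left":    delta.turn    -= 1; break;
      case "right":   delta.turn    += 1; break;
    }
  }
  return delta;
}

// Two viewers vote forward, one back, and the turns cancel:
console.log(mergeCommands(["forward", "forward", "back", "left", "right"]));
// → { forward: 1, turn: 0 }
```

The robot then only has to act on one small state change per tick, no matter how many people are mashing buttons in the hangout.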
You can check out a demo of OSCAR after the break.
Continue reading “Meet OSCAR, the Google Hangout robot”
By now we’re assuming you are all familiar with Google’s “Project Glass”, an ambitious augmented reality project for which they revealed a promotional video last week. [Will Powell] saw the promo vid and was so inspired that he attempted to rig up a demo of Project Glass for himself at home.
While it might seem like a daunting project to take on, [Will] does a lot of work with Kinect-based augmented reality, so his Vuzix/HD webcam/Dragon NaturallySpeaking mashup wasn’t a huge step beyond what he does at work. As you can see in the video below, the interface he implemented looks very much like the one Google showed off in their demo, responding to his voice commands in a similar fashion.
He says that the video was recorded in “real time”, though there are plenty of people who debate that claim. We’re guessing that he recorded the video stream fed into the Vuzix glasses rather than recording what was being shown in the glasses, which would make the most sense.
We’d hate to think that the video was faked, mostly because we would love to see Google encounter some healthy competition, but you can decide for yourself.
Continue reading “DIY “Project Glass” clone looks almost too good to be true”
As weird as it might sound, there’s a way to use Google documents as a web proxy. The image above is a screenshot of [Antonio] demonstrating how he can view text data from any site through the web giant’s cloud applications. Certain sites may be blocked from your location, but the big G can load whatever it wants. If all you need is the text, then so can you.
The hack takes advantage of the =IMPORTDATA() function of Google Spreadsheet. We guess the command is meant to make importing XML data possible, but hey, that’s pretty much what HTML data is too, right? But what good is the raw webpage code in a spreadsheet? This is where [Antonio] made a pretty brilliant leap in putting this one together. He authored a bookmarklet that provides a navigation interface, hides the raw code stored in the spreadsheet, and renders it in the browser. This ties together a user-supplied URL, reloading data on the hidden spreadsheet and refreshing the window as necessary. See for yourself in the clip after the break.
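To make the trick concrete, here's a hypothetical sketch of the two pieces a bookmarklet like this needs (the function names are ours, not [Antonio]'s): building the IMPORTDATA formula for a user-supplied URL, and stitching back together the page source that the spreadsheet splits across rows:

```javascript
// Build the spreadsheet formula that fetches a page through Google.
function importFormula(url) {
  // Double up any quotes so the URL survives inside the formula string.
  return '=IMPORTDATA("' + url.replace(/"/g, '""') + '")';
}

// IMPORTDATA spreads the fetched source across cells, one row per line;
// rejoin the rows so the browser can render the page as one document.
function reassemblePage(cellRows) {
  return cellRows.join("\n");
}

console.log(importFormula("http://example.com/page.html"));
// → =IMPORTDATA("http://example.com/page.html")
```

In the real bookmarklet the reassembled string would then be handed to the browser (e.g. via document.write) in place of the visible spreadsheet.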
Continue reading “Using Google documents as a web proxy”
So you can spend a bundle on a new phone and it comes with a voice-activated digital assistant. But let’s be honest, it’s much more satisfying if you coded up this feature yourself. Here’s a guide on doing just that by combining an Asterisk server with the Wolfram Alpha API.
Asterisk is a package we are already familiar with. It’s an open source Private Branch Exchange suite that lets you build your own telephone network. Chances are, you’re not going to build one just for this project, but if you do, make sure to document the process and let us know about it. With the Asterisk server in place you just need to give the assistant script an extension (in this case it’s 4747).
But then there’s the problem of translating your speech into text, which can be submitted as a Wolfram query. There’s an API for that too, one that uses Google to do the translation. From there you can tweak abbreviations and other parameters, but all-in-all your new assistant is ready to go. Call it up and ask what to do when you have a flat tire (yeah, that commercial drives us crazy too).
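Once the speech has been turned into text, the Wolfram side is just a web request. A minimal sketch of building that query against Wolfram Alpha's v2 API, assuming you've signed up for an app ID of your own (the function name here is illustrative):

```javascript
// Build a Wolfram Alpha v2 query URL from the transcribed question.
// The endpoint takes the question as "input" and your key as "appid".
function buildWolframQuery(question, appId) {
  return "http://api.wolframalpha.com/v2/query" +
         "?appid=" + encodeURIComponent(appId) +
         "&input=" + encodeURIComponent(question);
}

console.log(buildWolframQuery("how do I change a flat tire", "DEMO-KEY"));
```

The API answers in XML, so the assistant script only has to pull the plain-text result out of the response and read it back over the call.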
There have been many self-driving cars made with different levels of success, but probably the most well-known project is the Google car. What you may not have heard of, though, is the autonomous Google cart, or golf cart to be exact. The first video after the break explains the motivation behind the cart and the autonomous vehicle project. As with another autonomous vehicle we’ve featured before, they didn’t forget to include an E-stop button (at 1:03)!
In the second video (also after the break) Google’s Sebastian Thrun and Chris Urmson get into more of the details of how Google’s more famous autonomous Prius vehicles work and their travels around different towns in California. A safety driver is still used at this point, but the sensor package includes a roof-mounted 64-beam laser sensor, wheel encoders, radars, and a GPS sensor. With Google’s vast resources as well as their work with Street View and Google Maps, it’ll be interesting to see what comes of this technology. I, for one, welcome our new robotic overlords.
Continue reading “All About the Google Autonomous Vehicle Project”
After learning that Google’s ADK relied on using an Arduino-compatible board, [Benjamin] was disappointed that other microcontroller platforms weren’t invited to the party. Rather than switch camps, he took it upon himself to get the ADK working with his EvalBot. In fact, his modifications should allow the ADK to work with nearly any Stellaris ARM kit.
The hack is composed of two parts. The first, and most important bit is the USB host driver he developed to work with the ADK. The code borrows some bits from Texas Instruments, and will be published on GitHub once he gets a chance to clean up the source a bit. To get his phone working with the EvalBot, he also had to tweak the external USB power supply in order to provide the current required to operate properly with other USB-connected hardware.
It’s always nice to have more options when working with Google’s ADK, and [Benjamin’s] work is likely a welcome addition to any Stellaris developer’s toolkit.
Continue reading to see a quick video of his EvalBot ADK demo.
Continue reading “Google ADK on an EvalBot”
[charliex] from Null Space Labs wrote in to share a project that he and the rest of the gang have been working on over the last few weeks. The team has been remixing and building clones of the Google ADK demo board we saw earlier this year, in hopes of getting a huge batch prepped before Defcon 19.
Their version makes subtle changes to the original, such as extra header rows for Mega AVRs, higher quality RGB LEDs, and a nifty pirate-Android logo. They also added the ability for the board to send and receive IR signals allowing it to be used as a TV-B-Gone, as well as in more fruitful pursuits. The Arduino board used with the ADK has only undergone minor revisions, most of which were layout related.
[charliex] hasn’t mentioned a price for their improved ADK boards, but we’re guessing they will be substantially cheaper than the official Google version. In the meantime, check out their site for a boatload of pictures and videos of these boards undergoing various stages of construction.