How to Make Amazon Echo Control Fake WeMo Devices

[Chris] has been playing with the Amazon Echo. It’s sort of like having Siri or Google Now available as part of your home, but with built-in support for certain other home automation appliances like those from Belkin WeMo and Philips. The problem was [Chris] didn’t want to be limited to only those brands. He had other home automation gear that he felt should work with Amazon Echo, but didn’t. That’s when he came up with the clever idea to just emulate one of the supported platforms.

The WeMo devices use UPnP to perform certain functions over the network. [Chris] wanted to see how these communications actually worked, so he fired up his laptop, put his WiFi adapter into monitor mode, and used Wireshark to start collecting packets. He found that device detection starts with the Echo searching for WeMo devices over UPnP. Each device responds with its URL using HTTP over UDP. The Echo then requests the device's description from that URL, and the description comes back as an ordinary HTTP response.
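Something like the sketch below reproduces that discovery step from a laptop: it multicasts an SSDP search and prints whatever answers. The multicast address and port are the SSDP standard; the Belkin search target string is an assumption based on captures like the one described above.

```python
# Rough sketch of the Echo's discovery step, assuming the standard SSDP
# multicast address/port and the Belkin device search target.
import socket

MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: urn:Belkin:device:**\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.settimeout(3)
sock.sendto(MSEARCH.encode(), ("239.255.255.250", 1900))

# Each WeMo (real or emulated) replies over HTTP-over-UDP with a LOCATION
# header pointing at its setup.xml description URL.
try:
    while True:
        data, addr = sock.recvfrom(1024)
        print(addr, data.decode(errors="replace"))
except socket.timeout:
    pass
```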

The actual “on/off” functionality of the WeMo devices is simpler since the Echo already knows about the device. The Echo simply connects to the WeMo over the HTTP interface and issues a “SetBinaryState” command. The WeMo then obliges and returns a confirmation via HTTP.
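In Python, that on/off call is just a small SOAP POST. Here's a hedged sketch assuming the usual WeMo control path (/upnp/control/basicevent1) and port 49153; the exact endpoint can differ between devices.

```python
# Minimal sketch of SetBinaryState over HTTP; endpoint and port are assumptions.
import http.client

def set_binary_state(ip, on):
    body = (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
        's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        '<s:Body><u:SetBinaryState xmlns:u="urn:Belkin:service:basicevent:1">'
        "<BinaryState>%d</BinaryState>"
        "</u:SetBinaryState></s:Body></s:Envelope>" % (1 if on else 0)
    )
    conn = http.client.HTTPConnection(ip, 49153, timeout=5)
    conn.request(
        "POST", "/upnp/control/basicevent1", body,
        {
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPACTION": '"urn:Belkin:service:basicevent:1#SetBinaryState"',
        },
    )
    return conn.getresponse().status  # the WeMo confirms with an HTTP 200

print(set_binary_state("192.168.1.50", True))  # hypothetical device address
```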

[Image: How Echo Communicates with WeMo Devices]

[Chris] was able to use this information to set up his own WeMo “virtual cloud”. Each virtual device would have its own IP address, a listener for UDP broadcasts, and an HTTP listener running on the WeMo port, 49153. Each virtual device would also need to respond to the UPnP discovery requests and the “on/off” commands.

[Chris] used a Linux server, creating a new virtual Ethernet interface for each virtual WeMo switch. A single Python script runs the WeMo emulation, listening for the UPnP broadcast and sending a different response for each virtual device. Part of the response includes the device’s “friendly name”, which is what the Echo listens for when the user says voice commands. Since the virtual WeMo devices are free, this allows [Chris] to make multiple phrases for each device. So rather than be limited to “television”, he can also make a separate device for “TV” that performs the same function. [Chris] is also no longer limited to only specific brands of home automation gear.
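Put together, one virtual device boils down to a UDP responder plus a tiny HTTP server. The sketch below is a minimal illustration rather than [Chris]'s actual script; the friendly name, IP address, and UUID are made up, and a real setup would run one instance per virtual interface.

```python
# Sketch of a single emulated WeMo: answer SSDP discovery, then serve a tiny
# setup.xml containing the "friendly name" the Echo listens for.
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

FRIENDLY_NAME = "TV"           # hypothetical phrase the Echo will respond to
DEVICE_IP = "192.168.1.50"     # hypothetical address of this virtual device
HTTP_PORT = 49153

SETUP_XML = f"""<?xml version="1.0"?>
<root><device>
  <deviceType>urn:Belkin:device:controllee:1</deviceType>
  <friendlyName>{FRIENDLY_NAME}</friendlyName>
  <UDN>uuid:Socket-1_0-fake0001</UDN>
</device></root>"""

class WemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.endswith("setup.xml"):
            self.send_response(200)
            self.send_header("Content-Type", "text/xml")
            self.end_headers()
            self.wfile.write(SETUP_XML.encode())

def ssdp_responder():
    # Join the SSDP multicast group and answer Belkin-flavoured M-SEARCHes
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", 1900))
    mreq = socket.inet_aton("239.255.255.250") + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, addr = sock.recvfrom(1024)
        if b"M-SEARCH" in data and b"Belkin" in data:
            reply = (
                "HTTP/1.1 200 OK\r\n"
                f"LOCATION: http://{DEVICE_IP}:{HTTP_PORT}/setup.xml\r\n"
                "ST: urn:Belkin:device:**\r\n"
                "USN: uuid:Socket-1_0-fake0001::urn:Belkin:device:**\r\n"
                "\r\n"
            )
            sock.sendto(reply.encode(), addr)

threading.Thread(target=ssdp_responder, daemon=True).start()
HTTPServer(("", HTTP_PORT), WemoHandler).serve_forever()
```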

There’s still a long way to go in hacking this device. There’s a lot of hardware under the hood to work with. Has anyone else gotten their hands (and bench tools) on one of these?

Virtual LCD Using Python

[Prashant Mohta] got hold of a Raspberry Pi, a 16×2 LCD display and got down to writing a simple game in Python. Pretty soon, he realized that it was cumbersome to have the Ras-Pi and LCD connected when all he wanted to do was write the code. So he wrote a simple Python module which renders the LCD on his computer display. A simple, quick, useful hack.

[Prashant]’s code relies on Pygame, a set of Python modules designed for writing games. His code uses just two functions – one defines the LCD (characters and number of lines) while the other draws the characters on the screen by looking them up in an array. The code is just under 20 lines and available from his Github repo. It will be useful to those who are getting started with Python and want to understand some basics. Python is awesome and writing Python code is pretty simple.
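As a rough idea of how little code this takes, here's an independent sketch in the same spirit; the function names and cell sizes are illustrative, not [Prashant]'s actual module.

```python
# Minimal virtual 16x2 LCD drawn with Pygame; names and sizes are made up.
import pygame

CHAR_W, CHAR_H = 14, 22   # rough pixel cell for one LCD character

def lcd_init(cols=16, rows=2):
    pygame.init()
    screen = pygame.display.set_mode((cols * CHAR_W, rows * CHAR_H))
    font = pygame.font.SysFont("monospace", 18)
    return screen, font, cols, rows

def lcd_draw(lcd, lines):
    screen, font, cols, rows = lcd
    screen.fill((0, 60, 0))                       # greenish LCD background
    for row, text in enumerate(lines[:rows]):
        for col, ch in enumerate(text[:cols]):    # one cell per character
            glyph = font.render(ch, True, (200, 255, 200))
            screen.blit(glyph, (col * CHAR_W, row * CHAR_H))
    pygame.display.flip()

lcd = lcd_init()
lcd_draw(lcd, ["Hello, world!", "Virtual 16x2 LCD"])
```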

This might draw some flak from the naysayers so if you’re commenting below on the merits, or not, of Python, just keep your comments civil and healthy. In the video below, unrelated to this hack, [Raymond Hettinger] talks about “What makes Python so Awesome”!


Motion Sensing Water Gun Tweets Photos To Embarrass Enemies

[Ashish] is bringing office warfare to the next level with a motion sensing water gun. Not only does this water gun automatically fire when it detects motion, but it also takes a photo of the victim and publishes it on Twitter.

This hack began with the water gun. [Ashish] used a Super Soaker Thunderstorm motorized water gun. He pulled the case apart and cut one of the battery wires. He then lengthened the exposed ends and ran them out of the gun to his control circuit. He also added a protection diode to help prevent any reverse EMF from damaging his more sensitive electronics. The new control wires run to a MOSFET on a breadboard.

[Ashish] is using a Lightblue Bean board as a microcontroller. The Bean is Arduino compatible and can be programmed via low energy Bluetooth. The Bean uses an external PIR sensor to detect motion in the room. When it senses the motion, it activates the MOSFET which then turns on the water gun.

[Ashish] decided to use Node-RED and Python to link the Bean to a Twitter account. The system runs on a computer and monitors the Bean’s serial output. When it detects the proper command, it launches a Python script which takes a photo using a webcam. A second script then uploads that photo to a Twitter account. The Node-RED server can also monitor the Twitter account for incoming direct messages. If it detects a message with the correct password, it uses the rest of the message as a command to enable or disable the gun.
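The two Python steps might look something like the sketch below. The library choices (OpenCV for the webcam, tweepy for Twitter) and the placeholder credentials are assumptions, not necessarily what [Ashish] used.

```python
# Hedged sketch: grab a webcam frame, then post it to Twitter.
import cv2
import tweepy

def capture_photo(path="victim.jpg"):
    cam = cv2.VideoCapture(0)       # first webcam on the system
    ok, frame = cam.read()
    cam.release()
    if ok:
        cv2.imwrite(path, frame)
    return ok

def tweet_photo(path, status="Intruder soaked!"):
    # Placeholder credentials; a real setup needs Twitter API keys
    auth = tweepy.OAuth1UserHandler(
        "CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET"
    )
    api = tweepy.API(auth)
    media = api.media_upload(path)                     # upload the photo
    api.update_status(status=status, media_ids=[media.media_id])

if capture_photo():
    tweet_photo("victim.jpg")
```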

Reverse Engineering How a USB Switch Switches

[Daniel] found himself with a need to connect a single USB device to two Linux servers. After searching around, he managed to find an inexpensive USB switch designed to do just that. He noticed that the product description mentioned nothing about Linux support, but he figured it couldn’t be that hard to make it work.

[Daniel] started by plugging the device into a Windows PC for testing. Windows detected the device and installed an HID driver automatically.  The next step was to install the control software on the Windows system. This provided [Daniel] with a tray icon and a “switch” function. Clicking this button disconnected the HID device from the Windows PC and connected the actual USB device on the other side of the USB switch. The second computer would now have access to the HID device instead.

[Daniel] fired up a program called SnoopyPro. This software is used to inspect USB traffic. [Daniel] noticed that a single message repeated itself until he pressed the “switch” button. At that time, a final message was sent and the HID device disconnected.

Now it was time to get cracking on Linux. [Daniel] hooked up the switch to a Linux system and configured a udev rule to ensure that it always showed up as /dev/usbswitch. He then wrote a Python script to write the captured data to the usbswitch device. It was that simple. The device switched over as expected. So much for having no Linux support!
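The replay itself is only a few lines; something like the sketch below, where the command bytes are placeholders standing in for whatever the SnoopyPro capture showed when the "switch" button was pressed.

```python
# Replay the captured switch command to the device node created by the udev rule.
SWITCH_COMMAND = bytes([0x00, 0x00, 0x00])  # placeholder, not the real payload

with open("/dev/usbswitch", "wb") as dev:
    dev.write(SWITCH_COMMAND)
```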

The Live Still Life

Here’s a project that brings together artist [Justus Bruns] and engineers [Rishi Bhatnagar] and [Michel Jansen] to collaborate on an interactive work of Art. The Live Still Life is a classic still life, streamed live from India to anywhere in the world. It is the first step towards the creation of an art factory, where hundreds of these works will be made, preserved and streamed.

The Live Still Life is a physical composition of fresh fruit and vegetables displayed on a table with flatware, cutlery and other still objects. This is located in a wooden box in Bangalore. Every minute a photo is taken and the image is streamed, live, accessible instantly from anywhere in the world. Les Oiseaux de Merde’s Indian curator is on call to replace the fruit the minute it starts to rot so as to maintain the integrity of the image. In this way, while the image remains the same, the fight against decay is always present. The live stream can be viewed at this link.

The hardware is quite minimal: an internet-connected Raspberry Pi Model B, a Raspberry Pi camera module, a desk lamp for illumination, and a wooden enclosure to house it all, including the artwork. Getting the camera to work took just a few lines of Python. Live streaming the camera pictures took quite a bit more work than they expected. The server was written using a module called Exprestify, written on top of Express JS to facilitate easier RESTful functions. For something that looks straightforward, the team had to overcome several coding challenges, so if you’d like to dig into the code, some of it is hosted on Github, or you can ask [Rishi] since he still needs to clean it up quite a bit.
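The capture side really can be just a few lines of Python; roughly the sketch below, assuming the picamera library and a made-up output path.

```python
# One frame a minute from the Pi camera module; the output path is an assumption.
import time
import picamera

with picamera.PiCamera() as camera:
    camera.resolution = (1280, 720)
    while True:
        camera.capture("/home/pi/still-life/latest.jpg")  # overwritten each minute
        time.sleep(60)
```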

Hackaday Prize Entry: Python Powered Scientific Instrumentation

A common theme in The Hackaday Prize, and in general, is tools to make more tools. There are a lot of people out there trying to make the next Bus Pirate, and simply measuring things is the first step towards automating a house or creating the next great blinky invention.

In what is probably the most capable measurement system in the running for this year’s Hackaday Prize, [jithin] is working on a Python Powered Scientific Instrumentation Tool. It’s a microcontroller-powered box containing just about every imaginable benchtop electronics tool, from constant current supplies and LCR meters to waveform generators, frequency counters, and a logic analyzer.

This project is stuffed to the gills with just about every electronic tool imaginable; there are programmable gain amplifiers, voltage references, DACs and constant current sources, opamps and comparators, all connected to a bunch of banana jacks. All of these components are tied up in a nifty Python framework, allowing a bunch of measurements to be taken by a single box.

If that’s not enough, [jithin] is also working on wireless extension nodes for this box, to get data from multiple acquisition points where wires would be unfeasible. This feature uses an NRF24L01+ radio module; it offers more than enough bandwidth for a lot of sensors, and room for all the wireless nodes you would ever need.


Eye-Controlled Wheelchair Advances from Talented Teenage Hackers

[Myrijam Stoetzer] and her friend [Paul Foltin], 14- and 15-year-old kids from Duisburg, Germany, are working on an eye-movement-controlled wheelchair. They were inspired by the Eyewriter Project, which we’ve been following for a long time. Eyewriter was built for Tony Quan a.k.a Tempt1 by his friends. In 2003, Tempt1 was diagnosed with the degenerative nerve disorder ALS and is now fully paralyzed except for his eyes, but he has been able to use the EyeWriter to continue his art.

This is their first big leap moving up from Lego Mindstorms. The eye tracker consists of a safety glasses frame, a regular webcam, and IR SMD LEDs. They removed the IR blocking filter from the webcam to make it work in all lighting conditions. The image processing is handled by an Odroid U3 – a compact, low-cost ARM quad-core SBC capable of running Ubuntu, Android, and other Linux-based systems. They initially tried the Raspberry Pi, which managed just about 3 fps, compared to 13~15 fps from the Odroid. The code is written in Python and uses the OpenCV libraries; they are learning Python on the go. An Arduino is used to control the motor via an H-bridge controller, and also to calibrate the eye tracker. Potentiometers connected to the Arduino’s analog ports allow the tracker to be adjusted to individual requirements.

The webcam video stream is filtered to obtain the pupil position, which is compared to four presets for forward, reverse, left, and right. The presets can be adjusted using the potentiometers. An enable switch, manually activated at present, ensures the wheelchair moves only when commanded. Their plan is to later replace this switch with tongue activation or maybe cheek-muscle twitch detection.
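The core loop (grab a frame, find the pupil, pick the nearest preset, tell the Arduino) might look roughly like this sketch; the thresholds, preset coordinates, and serial protocol are illustrative assumptions, not the team's actual code, and OpenCV 4 plus pyserial are assumed.

```python
# Condensed sketch of the tracking loop: threshold out the dark pupil under IR
# light, find its centre, and send the nearest preset command to the Arduino.
import cv2
import serial

cap = cv2.VideoCapture(0)                        # the IR-modified webcam
arduino = serial.Serial("/dev/ttyUSB0", 9600)    # assumed port and baud rate

# Hypothetical pupil positions (in pixels) for forward, reverse, left, right
PRESETS = {"F": (320, 160), "B": (320, 320), "L": (200, 240), "R": (440, 240)}

def nearest_command(cx, cy):
    # Pick whichever preset point the pupil centre is closest to
    return min(PRESETS, key=lambda k: (PRESETS[k][0] - cx) ** 2 + (PRESETS[k][1] - cy) ** 2)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The pupil is the darkest blob under IR illumination
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        (cx, cy), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
        arduino.write(nearest_command(cx, cy).encode())  # e.g. b"F" for forward
```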

First tests were on a small mockup robotic platform. After winning a local competition, they bought a second-hand wheelchair and started all over again. This time they tried the Raspberry Pi 2 Model B, which managed about 8~9 fps. Not as good as the Odroid, but at half the cost it seemed like a workable solution, since their aim is to make the build as cheap as possible. They would appreciate any help improving the performance – maybe tightening up their code or using all four cores more efficiently.

For the bigger wheelchair, they used recycled car windshield wiper motors and some relays to switch them. They also used a 3D printer to print an enclosure for the camera and wheels to help turn the wheelchair. Further details are also available on [Myrijam]’s blog. They documented their build (German, PDF) and have their sights set on the German National Science Fair. The team is working on an English translation of the documentation and will release all design files and source code under a CC BY-NC license soon.