A Menorah For The 21st Century

For enterprising makers new and experienced alike, this time of year is a great chance to apply their skills to create unique gifts and decorations for family and friends. [Mike Diamond] of What I Made Today built a phone-controlled, light-up menorah. It’s a charming way to display some home automation know-how during the holidays.

Expanding on his previous project — a pocket-sized menorah — [Mike] needed little more than a Raspberry Pi Zero with a WiFi dongle, some LEDs, wire, and tea lights for materials. Setting up Blynk on the Raspberry Pi and a phone to control the lights ties it all together, and the finished electronics are mounted in an old monitor housing.
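If you fancy riffing on the idea, the Blynk side boils down to very little code. Below is a minimal sketch of the concept, written Arduino-style for familiarity (the Linux build of the Blynk C++ library uses the same BLYNK_WRITE handlers). The pin numbers, auth token, and the choice of one virtual pin carrying the night count are our placeholders, not [Mike]'s actual wiring:

```cpp
// Minimal sketch of the idea, Arduino-flavoured for clarity.
// Pins, the auth token, and the V1 layout are assumptions.
#define BLYNK_PRINT Serial
#include <BlynkSimpleEsp8266.h>

const int candlePins[9] = {2, 3, 4, 5, 6, 7, 8, 9, 10}; // 8 candles + shamash

BLYNK_WRITE(V1) {                     // the app sends which night it is (0-8)
  int night = param.asInt();
  digitalWrite(candlePins[8], HIGH);  // the shamash stays lit
  for (int i = 0; i < 8; i++) {
    digitalWrite(candlePins[i], i < night ? HIGH : LOW);
  }
}

void setup() {
  for (int pin : candlePins) pinMode(pin, OUTPUT);
  Blynk.begin("YOUR_AUTH_TOKEN", "ssid", "password");
}

void loop() {
  Blynk.run();                        // keep the link to the Blynk server alive
}
```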

Continue reading “A Menorah For The 21st Century”

Smartphone Will Destroy You At Air Hockey

Most of us carry a spectacularly powerful computer in our pocket, which we rarely use for much more than web browsing, social media, and maybe the occasional phone call. Our mobile phones are technological miracles, but their potential sometimes seems wasted.

It’s always a pleasure to see something that makes use of a mobile phone to drive some nuts-and-bolts hardware. [Jose Julio]’s project does just that, using the phone as the brains behind a robotic air hockey table.

Readers with long memories will remember previous air hockey tables from [Jose], using 3D printer components controlled by an Arduino Mega with a webcam suspended above the field of play. This version transfers camera, machine vision, and game strategy to an Android app, leaving the Arduino to control the hardware under wireless network command from above.
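To get a feel for what the app is doing, here is a rough sketch (ours, not [Jose]'s code) of the vision side in C++ with OpenCV: threshold the puck's colour in HSV, find its centroid, then extrapolate a straight-line path to the robot's defence line. The HSV range and table geometry are placeholders, and a real implementation would also reflect the predicted path off the side walls:

```cpp
#include <opencv2/opencv.hpp>

// Find the puck by colour: HSV threshold, then centroid of the mask.
cv::Point2f findPuck(const cv::Mat& frameBGR) {
  cv::Mat hsv, mask;
  cv::cvtColor(frameBGR, hsv, cv::COLOR_BGR2HSV);
  cv::inRange(hsv, cv::Scalar(0, 120, 120), cv::Scalar(10, 255, 255), mask); // red-ish puck
  cv::Moments m = cv::moments(mask, true);
  if (m.m00 < 1.0) return {-1.0f, -1.0f};           // puck not found
  return {float(m.m10 / m.m00), float(m.m01 / m.m00)};
}

// Predict where the puck crosses x = defenceX, given its position in two
// consecutive frames.
float interceptY(cv::Point2f prev, cv::Point2f now, float defenceX) {
  cv::Point2f v = now - prev;                       // pixels per frame
  if (v.x >= 0) return -1.0f;                       // moving away from the robot
  float t = (defenceX - now.x) / v.x;               // frames until crossing
  return now.y + v.y * t;
}
```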

The result, which you can see in the video below the break, is an extremely fast-paced game, with the robot looking unbeatable. If you want to build your own, there are full instructions and code on GitHub; alternatively, he sells the project as a kit via a link on the page above.

Continue reading “Smartphone Will Destroy You At Air Hockey”

Hackaday Prize Entry: LipSync, Smartphone Access For Quadriplegic People

For most of us, our touch-screen smartphones have become an indispensable accessory. Without thinking, we tap and swipe our way through our digital existence; the promise of ubiquitous, truly portable computing has finally been delivered.

Smartphones present a problem, though, for some people with physical impairments. A touchscreen requires manual dexterity on a scale we able-bodied people take for granted, but it remains a useless glass slab to someone unable to use their arms.

LipSync is a project that aims to address the problem of smartphone usage for one such group, quadriplegic people. It’s a mouth-operated joystick for the phone’s on-screen cursor, with sip-and-puff vacuum control for simulating actions such as screen taps and the back button.

To the smartphone itself, the device appears as a standard Bluetooth pointing device, while at its business end the joystick and pressure sensor both interface to a Bluetooth module through an Arduino Micro. The EAGLE board and schematic files are available on the project’s hackaday.io page linked above, and there is a GitHub repository for the code.
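The control flow is easy to picture. Here is a hedged sketch that uses the Micro's built-in USB Mouse library as a stand-in for the project's Bluetooth module (the logic reads the same either way: joystick deflection becomes cursor movement, sips and puffs become clicks); the pins and thresholds are placeholders:

```cpp
// Sketch of the control loop, with USB Mouse standing in for Bluetooth HID.
#include <Mouse.h>

const int X_PIN = A0, Y_PIN = A1, PRESSURE_PIN = A2;
const int CENTER = 512, DEADZONE = 40;
const int PUFF_THRESHOLD = 700, SIP_THRESHOLD = 300;

void setup() {
  Mouse.begin();
}

void loop() {
  int dx = analogRead(X_PIN) - CENTER;
  int dy = analogRead(Y_PIN) - CENTER;
  if (abs(dx) > DEADZONE || abs(dy) > DEADZONE) {
    Mouse.move(dx / 64, dy / 64);            // scale raw deflection to pixels
  }
  int p = analogRead(PRESSURE_PIN);
  if (p > PUFF_THRESHOLD)      Mouse.click(MOUSE_LEFT);   // puff = screen tap
  else if (p < SIP_THRESHOLD)  Mouse.click(MOUSE_RIGHT);  // sip = back button
  delay(10);
}
```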

Technology is such a part of our lives these days, and it’s great to see projects like this bridge the usability gaps for everyone. Needless to say, it’s a perfect candidate for the Assistive Technology round of the Hackaday Prize.

Hand Waving Unlocks Door

Who doesn’t like the user interface in the movie Minority Report where [Tom Cruise] manipulates a giant computer screen by just waving his hands in front of it? [AdhamN] wanted to unlock his door with hand gestures. While it isn’t as seamless as [Tom’s] Hollywood interface, it manages to do the job. You just have to hold on to your smartphone while you gesture.

The project uses an Arduino and a servo motor to move a bolt back and forth. The gesture part requires a 1Sheeld, a board that interfaces to a phone and allows your Arduino program to use the phone’s capabilities (in this case, the accelerometer).

The rest follows naturally: the 1Sheeld reads the accelerometer data, and when it sees the right gesture, the Arduino operates the servo. It would be interesting to do this with a smartwatch, which would perhaps look a little less conspicuous.
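For the curious, the whole loop might look something like this minimal sketch, assuming the official OneSheeld Arduino library; the gesture (a hard sideways shake) and the pin numbers are placeholders for whatever [AdhamN] actually recognises:

```cpp
// A minimal sketch of the idea; gesture and pins are assumptions.
#include <OneSheeld.h>
#include <Servo.h>

Servo bolt;
const int LOCKED = 0, UNLOCKED = 90;   // servo angles, adjust to the bolt

void setup() {
  OneSheeld.begin();                   // talks to the phone's 1Sheeld app
  bolt.attach(9);
  bolt.write(LOCKED);
}

void loop() {
  // The phone's accelerometer is read through the 1Sheeld app.
  float x = AccelerometerSensor.getX();
  if (fabs(x) > 15.0) {                // a sharp sideways jerk = "the gesture"
    bolt.write(UNLOCKED);
    delay(5000);                       // stay unlocked for five seconds
    bolt.write(LOCKED);
  }
}
```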

We covered the 1Sheeld board a while back. Of course, you could also use NFC or some other sensor technology to trigger the mechanism. You can find a video that describes the 1Sheeld below.

Continue reading “Hand Waving Unlocks Door”

Abusing A Cellphone Screen With Solenoids Posts High Score

This Raspberry Pi 2 with computer vision and two solenoid “fingers” was getting absurdly high scores on a mobile game as of late 2015, but only recently has [Kristian] finished fleshing the project out with detailed documentation.

Developed for a course in image analysis and computer vision, this project wasn’t really about cheating at a mobile game. It wasn’t even about a robotic interface to a smartphone screen; it was a platform for developing and demonstrating the image analysis theory he was learning, and the computer vision portion is no hack job. OpenCV was used as a foundation for accessing the camera, but none of its built-in filters are used; all of the image analysis is implemented from scratch.

The game is a simple one: humans and zombies move downward in two columns, and zombies (green) should get a screen tap but humans should not. The Raspberry Pi camera takes pictures of the smartphone’s screen, to which an HSV filter is applied to filter out everything except green objects (zombies). That alone would be enough to get some basic results, but not nearly good enough to be truly reliable and repeatable. Therefore, after picking out the green objects comes a whole chain of additional filtering. The details are covered in [Kristian]’s blog post, but the final report for the project (PDF) is where the real detail is.
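To give a flavour of those first stages, here is a short C++ sketch using OpenCV's built-in functions purely for brevity. Keep in mind that [Kristian] implemented these steps himself rather than calling library filters, and the exact HSV bounds and the later stages of his chain live in the report:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Return centroids of green blobs large enough to be zombie sprites.
std::vector<cv::Point2f> findZombies(const cv::Mat& frameBGR) {
  cv::Mat hsv, mask;
  cv::cvtColor(frameBGR, hsv, cv::COLOR_BGR2HSV);
  // Keep only strongly green pixels (hue ~60 on OpenCV's 0-179 scale).
  cv::inRange(hsv, cv::Scalar(45, 80, 80), cv::Scalar(75, 255, 255), mask);
  // One example of "additional filtering": erode away single-pixel noise.
  cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                   cv::getStructuringElement(cv::MORPH_RECT, {3, 3}));
  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
  std::vector<cv::Point2f> hits;
  for (const auto& c : contours) {
    if (cv::contourArea(c) < 200.0) continue;       // too small: noise
    cv::Moments m = cv::moments(c);
    hits.push_back({float(m.m10 / m.m00), float(m.m01 / m.m00)});
  }
  return hits;                                      // centroids to tap
}
```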

If you’re interested mainly in seeing a machine pound out flawless victories, the video below shows everything running smoothly. The pounding sounds make it seem like the screen is taking a lot of abuse, but [Kristian] mentions that’s actually noise from the solenoids and not a product of them battling the touchscreen. This setup can be easily adapted to test out apps on different models of phones — something that has historically cost quite a bit of dough.

If you’re interested in the nitty-gritty details of the reasons and methods used for the computer vision portions, be sure to go through [Kristian]’s github repository where everything about the project lives (including the aforementioned final report.)

Continue reading “Abusing A Cellphone Screen With Solenoids Posts High Score”

Smartphone-based Robotic Rover Project Goes Open Source

[Aldric Négrier] wrote in to let us know that his DriveMyPhone project has been open sourced. The project is part telepresence, part remote-controlled vehicle, and part robotic rover, a concept on which, he says, “I spent more time […] than I should have.” He has shared not just the CAD files but every detail, including tips on assembly. He admits that a robotic chassis for a smartphone might not seem like a particularly new idea today, but it was “an idea with more potential” back in 2010 when he first started.

The chassis is made to cradle a smartphone. Fire up your favorite videoconferencing software and you have a way to see where you’re going as well as hear (and speak to) your surroundings. Bluetooth communication between the phone and the chassis provides wireless control. That said, this unit is clearly designed to handle far more challenging terrain than the average office environment, and it aims not only to be attractive but also to be as accessible and open to repurposing and modification as possible.
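One common way to wire up that kind of control, sketched below as an assumption on our part rather than [Aldric]'s documented protocol, is an HC-05-style serial Bluetooth module streaming single-character drive commands from a phone app to the rover's microcontroller:

```cpp
// Placeholder pins and command set; the real protocol may differ.
#include <Arduino.h>

const int L_FWD = 5, L_REV = 6, R_FWD = 9, R_REV = 10; // motor driver inputs

void drive(bool lf, bool lr, bool rf, bool rr) {
  digitalWrite(L_FWD, lf); digitalWrite(L_REV, lr);
  digitalWrite(R_FWD, rf); digitalWrite(R_REV, rr);
}

void setup() {
  pinMode(L_FWD, OUTPUT); pinMode(L_REV, OUTPUT);
  pinMode(R_FWD, OUTPUT); pinMode(R_REV, OUTPUT);
  Serial1.begin(9600);                 // Bluetooth module on the hardware UART
}

void loop() {
  if (!Serial1.available()) return;
  switch (Serial1.read()) {
    case 'F': drive(1, 0, 1, 0); break;  // forward
    case 'B': drive(0, 1, 0, 1); break;  // reverse
    case 'L': drive(0, 1, 1, 0); break;  // spin left
    case 'R': drive(1, 0, 0, 1); break;  // spin right
    default:  drive(0, 0, 0, 0); break;  // anything else: stop
  }
}
```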

Continue reading “Smartphone-based Robotic Rover Project Goes Open Source”

Smartphone And IR Line Laser Measure Distance

Measuring distance with lasers is a mainstay of self-driving vehicles and ambitious robotics projects. The fine folks at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) decided to tackle the problem in an innovative way. [Jason H. Gao] and [Li-Shiuan Peh] used an infra-red (IR) line laser and the camera on a smartphone. Their prototype cost only $49 since they used a smartphone that was on hand. The article reports good results using the device outdoors in direct sunlight, which is often a challenge for inexpensive lidars.

The line laser projects a horizontal line that is reflected back to the camera on the phone. The vertical position of the line in the camera image lets the phone calculate the distance by parallax. To bring out the faint laser reflection, the algorithm compares four images – two with the laser on and two with it off – and subtracts the background. Using a smartphone here is ideal: it automatically adjusts for light level, and the setup can easily be upgraded later by swapping in a newer phone with a better camera.
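Both tricks reduce to surprisingly little code. Here is a hedged sketch (ours, not the CSAIL implementation): average and subtract the laser-off frames from the laser-on frames to isolate the line, then triangulate. With the laser mounted a baseline b below the lens and aimed parallel to the optical axis, a pinhole model gives distance = f * b / dy, where dy is how far in pixels the line sits below the image centre. The focal length, baseline, and the assumption of a single-channel grayscale diff are placeholders:

```cpp
#include <opencv2/opencv.hpp>

const double FOCAL_PX = 3000.0;   // focal length in pixels (needs calibration)
const double BASELINE_M = 0.05;   // camera-to-laser offset in metres

// Average the two on/off pairs to suppress noise, then subtract background.
cv::Mat laserOnly(const cv::Mat& on1, const cv::Mat& on2,
                  const cv::Mat& off1, const cv::Mat& off2) {
  cv::Mat on, off, diff;
  cv::addWeighted(on1, 0.5, on2, 0.5, 0, on);
  cv::addWeighted(off1, 0.5, off2, 0.5, 0, off);
  cv::subtract(on, off, diff);    // leaves (mostly) just the laser line
  return diff;
}

// For one image column of the grayscale diff, find the laser row and
// convert it to a distance by similar triangles.
double distanceAtColumn(const cv::Mat& diffGray, int col) {
  cv::Point maxLoc;
  cv::minMaxLoc(diffGray.col(col), nullptr, nullptr, nullptr, &maxLoc);
  double dy = maxLoc.y - diffGray.rows / 2.0;   // pixels below image centre
  if (dy <= 0) return -1.0;                     // no reflection found
  return FOCAL_PX * BASELINE_M / dy;            // distance in metres
}
```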

This should be a cheap and easily replicable setup. If you make one of these, let us know. If you need something more refined, check out this post on interfacing the Neato vacuum cleaner’s XV-11a lidar with the Raspberry Pi.

Continue reading “Smartphone And IR Line Laser Measure Distance”