There’s a car race going on right now, but it’s not on any race track. A number of companies are vying to get their prototypes on the road first. [Anurag] has already completed the task, after a fashion: his car and road are functional models.
While his car isn’t quite as involved as Google’s self-driving car, and it doesn’t have to deal with pedestrians and other active obstacles, it does use a computer and various sensors to make decisions about how to drive. A Raspberry Pi 2 takes the wheel in this build, taking input from a Pi camera and an ultrasonic distance sensor. The Pi communicates with another computer over WiFi, where a neural network makes the decisions about how to drive the car. Those decisions are also informed by a database of pictures of the track, which gives the system a point of reference to go by.
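Splitting sensing (on the Pi) from inference (on the remote computer) means sensor readings have to cross the WiFi link constantly. [Anurag]'s actual wire protocol isn't documented; purely as an illustration, here's how one might frame an ultrasonic reading for the link using Python's `struct` module (the packet layout and field names are our own assumptions):

```python
import struct

# Hypothetical wire format: little-endian frame counter (uint32)
# followed by the ultrasonic distance in centimeters (float32).
PACKET_FMT = "<If"

def pack_reading(frame_id, distance_cm):
    """Serialize one sensor reading for the WiFi link."""
    return struct.pack(PACKET_FMT, frame_id, distance_cm)

def unpack_reading(payload):
    """Deserialize a reading on the neural-network side."""
    frame_id, distance_cm = struct.unpack(PACKET_FMT, payload)
    return frame_id, distance_cm

# Round-trip check: 17.5 is exactly representable as float32.
payload = pack_reading(42, 17.5)
print(unpack_reading(payload))  # (42, 17.5)
```

A fixed binary format like this keeps per-reading overhead tiny, which matters when camera frames are already competing for the same radio.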
The video of the car in action is worth a look. It’s not perfect, but it’s quite an accomplishment for this type of project. The possibility that self-driving car models could drive around model sets like those model railroad hobbyists create is intriguing. Of course, this isn’t [Anurag]’s first lap around the block. He’s already been featured for building a car that can drive based on hand gestures. We’re looking forward to when he can collide with model buses.
Continue reading “Self-Driving Cars Get Tiny”
We all know that guy (or, in some cases, we are that guy) who can listen to a car running and say something like, “Yep. Needs a lifter adjustment.” A startup company named Augury aims to replace that skill with an iPhone app.
Aimed at commercial installations, the system has a technician attach a magnetic sensor to the body of the machine in question. The sensor connects to a custom box called an Auguscope that collects vibration and ultrasonic data and forwards it via the iPhone to a back-end server for analysis. Moving the sensor can even allow the back end to determine the location of the fault in some cases. The comparison data the back end uses includes reference data on similar machines as well as historical data about the machine in question.
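Augury's actual analysis is proprietary, but the core idea of comparing live vibration data against a healthy reference can be sketched simply. Here's a toy Python version that flags a machine when its RMS vibration amplitude drifts too far from a baseline (the signals, threshold, and function names are invented for illustration):

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a vibration trace."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def needs_service(samples, baseline_rms, tolerance=0.25):
    """Flag the machine if RMS amplitude deviates more than
    `tolerance` (as a fraction) from the healthy reference."""
    deviation = abs(rms(samples) - baseline_rms) / baseline_rms
    return deviation > tolerance

# Simulated traces: a worn bearing vibrates harder than a healthy one.
healthy = [0.10 * math.sin(0.3 * i) for i in range(1000)]
worn    = [0.25 * math.sin(0.3 * i) for i in range(1000)]
base = rms(healthy)
print(needs_service(healthy, base))  # False
print(needs_service(worn, base))     # True
```

A real system would of course work in the frequency domain, where specific faults (imbalance, bearing wear, misalignment) show up at characteristic frequencies rather than just as louder overall vibration.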
Continue reading “Listen Up: iPhone Hack Diagnoses HVAC”
[Jesse Burstyn] and some colleagues at Queen’s University and Carleton University (both in Canada) are delivering a paper at INTERACT 2015 about PrintPut, their system for printing sensors directly into 3D printed objects. Using a printer with dual extrusion and conductive ABS filament, they have successfully formed capacitive touch sensors, digital resistive sensors, and analog resistive sensors.
In practice, this means they can print buttons, sliders, and even touch pads directly into objects. They also have a design for several pressure sensors and a flex sensor. The system includes scripts for the Rhinoceros 3D CAD package. Designers can create a model in any CAD package they want (including Rhinoceros) and then use these scripts to define the interactive areas.
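The paper focuses on the sensor geometry rather than the readout electronics, but an analog resistive trace like PrintPut's would typically be read as half of a voltage divider feeding an ADC. A quick sketch of that math (the 10 kΩ reference resistor and 3.3 V supply are our assumptions, not values from the paper):

```python
def sensor_resistance(adc_volts, vcc=3.3, r_fixed=10_000):
    """Infer the printed trace's resistance from the divider's
    midpoint voltage: Vout = Vcc * R_sensor / (R_fixed + R_sensor),
    solved for R_sensor."""
    if adc_volts >= vcc:
        raise ValueError("reading at or above the supply rail")
    return r_fixed * adc_volts / (vcc - adc_volts)

# At the divider midpoint (Vout = Vcc / 2) the printed trace's
# resistance matches the fixed reference resistor.
print(sensor_resistance(1.65))  # ~10000 ohms
```

Flexing or pressing conductive ABS changes its resistance, so tracking this value over time is what turns a printed trace into a slider, flex sensor, or pressure pad.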
Continue reading “Buttons, Sliders, and Touchpads All 3D Printed with PrintPut”
One of our chief complaints about the Raspberry Pi is that it doesn’t have a lot of I/O. There are plenty of add-ons, of course, to expand its I/O capabilities. The Raspberry Pi Foundation itself recently created the Sense HAT, which adds a lot of features to a Pi, although they might not be the ones we would have picked. The boards were made for the AstroPi project (the project that allowed UK schools to run experiments in space). They don’t appear to be officially for sale to the public yet, but according to the Foundation’s site, they will be selling them soon. Update: Despite some pages on the Raspberry Pi site saying they aren’t out yet, they apparently are.
Continue reading “Sense Hat Lights up Pi”
Imagine you’re a farmer trying to grow a crop under drought conditions. Up-to-the-minute data on soil moisture can help you to decide where and when to irrigate, which directly affects your crop yield and your bottom line. More sensors would mean more data and a better spatial picture of conditions, but the cost of wired soil sensors would be crippling. Wireless sensors that tap into GSM or some sort of mesh network would be better, but each sensor would still need power, and maintenance costs would quickly mount. But what if you could deploy a vast number of cheap RFID-linked sensors in your fields? And what if an autonomous vehicle could be tasked with the job of polling the sensors and reporting the data? That’s one scenario imagined in a recent scholarly paper about a mobile Internet of Things (PDF link).
In the paper, authors [Jennifer Wang], [Erik Schluntz], [Brian Otis], and [Travis Deyle] put a commercially available quadcopter and an RC car to work. Both platforms were fitted with telemetry radios, GPS, and an off-the-shelf RFID tag reader and antenna. For their sensor array, they selected passive UHF RFID tags coupled to a number of different sensors, including a resistance sensor used to measure soil moisture. A ground-control system was developed that allowed both the quad and the car to maneuver to waypoints under GPS guidance to poll sensors and report back.
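The paper's ground-control system sends the vehicles waypoint-to-waypoint; the authors' actual planner isn't reproduced here, but a greedy nearest-neighbor ordering is the obvious first cut for a vehicle polling scattered tags. A minimal sketch (the coordinates are made up):

```python
import math

def plan_route(start, sensors):
    """Order sensor waypoints greedily: always fly to the nearest
    unvisited tag next. Not optimal, but cheap to compute on a
    ground station and good enough for a handful of waypoints."""
    route, here, todo = [], start, list(sensors)
    while todo:
        nxt = min(todo, key=lambda p: math.dist(here, p))
        route.append(nxt)
        todo.remove(nxt)
        here = nxt
    return route

tags = [(5, 5), (1, 0), (0, 4)]
print(plan_route((0, 0), tags))  # [(1, 0), (0, 4), (5, 5)]
```

With hundreds of tags in a real field, route length translates directly into battery budget, so a smarter TSP heuristic would quickly pay for itself.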
Beyond agriculture, the possibilities for an IoT based on cheap sensors and autonomous vehicles to poll them are limitless. The authors rightly point out the challenges of building out a commercial system based on these principles, but by starting with COTS components and striving to keep installed costs to a minimum, we think they’ve done a great proof of concept here.
A team of Cornell students have designed and built their own electronic boxing trainer system. The product of their work is a game similar to Whack-A-Mole. There are five square pads organized roughly into the shape of a human torso and head. Each pad will light up based on a pre-programmed pattern. When the pad lights up, it’s the player’s job to punch it! The game keeps track of the player’s accuracy as well as their reaction time.
The team was trying to keep their budget under $100, which meant that off-the-shelf components would be too costly. To remedy this, they designed their own force sensors. The sensors are basically a sandwich of a few different materials. In the center is a 10″ by 10″ square of ESD foam. Pressed against it is a 1/2″ thick sheet of insulating foam rubber. This foam rubber sheet has 1/4″ slits cut into it, resulting in something that looks like jail bars. Sandwiching these two pieces of foam is fine aluminum window screen. Copper wire is fixed to the screen using conductive glue. Finally, the whole thing is sandwiched between flattened pieces of corrugated cardboard to protect the screen.
The sensors are mounted flat against a wall. When a user punches a sensor, it compresses. This compression causes the resistance between the two pieces of aluminum screen to change, and that resistance can be measured to detect a hit. The students found that if the sensor is hit harder, more surface area becomes compressed. This results in a greater change in resistance, which can be read as a more powerful hit. Unfortunately, the sensor needs to be calibrated for whatever is hitting it, since the size of the hitting object can throw off the measurement.
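In code, the students' scheme boils down to sampling the sensor's resistance and mapping the drop from its resting value onto a hit strength. A sketch of that mapping (the resting and full-hit resistances are placeholders, since as noted, the real values depend on who's doing the punching):

```python
def hit_strength(resistance, rest_resistance, full_hit_resistance):
    """Scale the measured resistance drop to a 0.0-1.0 hit strength.
    A harder punch compresses more foam, so resistance drops further."""
    span = rest_resistance - full_hit_resistance
    drop = rest_resistance - resistance
    return max(0.0, min(1.0, drop / span))

# Placeholder calibration: 50 kOhm at rest, 5 kOhm on a full-power hit.
print(hit_strength(50_000, 50_000, 5_000))  # 0.0 (no contact)
print(hit_strength(27_500, 50_000, 5_000))  # 0.5 (medium punch)
```

Recalibrating for a new boxer is then just a matter of re-measuring those two endpoint resistances.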
Each sensor pad is surrounded by a strip of LEDs. The LEDs light up to indicate which pad the user is supposed to hit. Everything is controlled by an ATmega1284P microcontroller. This is the latest in a string of student projects to come out of Cornell. Make sure to watch the demonstration video below. Continue reading “Boxing Trainer Uses DIY Force Sensors”
By now you’ve seen almost everything tweet. But have you seen the (French) twittering chicken coop? (Google translate link) [Hugo] kept two chickens as part of a household waste-reduction campaign, and afterward set to work automating their coop.
Even if you don’t read French, the chickens’ twitter feed basically tells the story.
The setup can take IR photographs of sleeping chickens and notify [Hugo] when it’s time to collect the eggs. Naturally, plenty of other sensors are on board as well. The coop can tweet based on ambient temperature, nest temperature, light level, motion sensor status, or the amount of remaining chicken feed. You can easily follow whether the two fowl are in the coop or out in the yard. It’s like Big Brother, only for birds.
The application is, frankly, ridiculous. But if you’re into home (or coop) automation, there’s a lot to be learned and the project is very well documented. [Hugo] used OpenCV for visual egg detection, and custom Python code to slightly randomize the tweets’ text. All of these details are up on his Github account.
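[Hugo]'s actual Python source is on his GitHub; the tweet-randomizing idea itself takes only a few lines. A minimal stand-in (the templates are invented here, not his actual phrasing):

```python
import random

# Invented templates; a real feed would swap phrasing, emoji, etc.
TEMPLATES = [
    "Egg alert! {count} fresh in the nest.",
    "Cluck cluck: {count} egg(s) ready for pickup.",
    "The nest box reports {count} egg(s). Breakfast secured.",
]

def egg_tweet(count, rng=random):
    """Pick a template at random so the feed doesn't repeat itself
    verbatim (which would also trip Twitter's duplicate filter)."""
    return rng.choice(TEMPLATES).format(count=count)

print(egg_tweet(2))
```

The duplicate-avoidance point is practical, not cosmetic: Twitter rejects tweets identical to recent ones from the same account, so a coop that lays the same number of eggs every day needs some variation to keep posting.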
And if you just can’t get enough chicken-coop hacks, be sure to check out this mobile chicken coop, this coop in the shape of a golden spiral, or this Bluetooth-enabled, talking chicken coop, among others. You’d think our name was Coop-a-Day.