Signals Intelligence (SIGINT) isn't something you normally associate with home hackers, but the Deep Sweep project is looking to change that. It is a balloon-borne platform that captures radio signals in the stratosphere, particularly conversations between drones and satellites. Created by three students at the Frank Ratchye Studio for Creative Inquiry at Carnegie Mellon, Deep Sweep rides under a balloon, capturing signals over a wide range of frequencies and logging them for later analysis. The current version captures data on three frequency bands: LF/HF (10 kHz – 30 MHz), UHF (650 – 1650 MHz) and SHF (10 – 20 GHz). The SHF band in particular is often used for satellite links to and from drones, and those signals are difficult to intercept from the ground because they are directed upwards towards the satellite. By creating a platform that can fly several kilometers above the earth, the team hopes to capture some of this elusive traffic.
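Covering bands that wide with a narrow-band receiver means retuning many times and logging each capture. As a rough illustration (the band edges come from the article, but the 2 MHz tuner bandwidth is an assumed example, not a Deep Sweep specification), a retune plan might be computed like this:

```python
import math

# Band edges from the article, in Hz.
BANDS = {
    "LF/HF": (10e3, 30e6),
    "UHF":   (650e6, 1650e6),
    "SHF":   (10e9, 20e9),
}

def sweep_centers(lo_hz, hi_hz, bw_hz):
    """Center frequencies needed to cover [lo_hz, hi_hz] with a tuner of bandwidth bw_hz."""
    steps = math.ceil((hi_hz - lo_hz) / bw_hz)
    return [lo_hz + bw_hz / 2 + i * bw_hz for i in range(steps)]

# Example: covering the UHF band with a hypothetical 2 MHz-wide tuner
# takes 500 retune-and-record steps per pass.
uhf_plan = sweep_centers(*BANDS["UHF"], bw_hz=2e6)
```

The size of that plan is a good reminder of why the payload simply logs raw captures for analysis on the ground rather than trying to demodulate anything in flight.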
So far, the team has made two flights in Europe, both of which encountered technical issues. The first had a battery fault and only captured 10 minutes of data, and the second flew further than expected and ended up in Belarus, a country that isn’t likely to welcome this kind of thing. Fortunately, they were able to recover the balloon and are working on future launches in Europe and the USA. It will be interesting to see how the Department of Homeland Security feels about this.
The 2013 DARPA Robotics Challenge Trials have finished up. The big winner is Team Schaft, seen above preparing to drive in the vehicle trial. This isn't the end of the line for DARPA's Robotics Challenge – there is still one more major event ahead. The DRC Finals will be held at the end of 2014. The tasks will be similar to what we saw today, but this time communications between the teams and their robots will be intentionally degraded to simulate real-world disaster conditions. The teams today were competing for DARPA funding: each of the top eight teams is eligible for up to $1 million USD from DARPA. Teams not making the cut are still welcome to compete in the finals using other sources of funding.
The trials were broken up into eight events: Door, Debris, Valve, Wall, Hose, Terrain, Ladder, and Vehicle. Each event was further divided into three parts, each worth one point, and a robot that completed the entire task with no human intervention earned a bonus point. With all bonuses, 32 points were available. Team Schaft won the event with an incredible total of 27 points. In second place was Team IHMC (Institute for Human and Machine Cognition) with 20 points. Team IHMC deserves special praise, as they were using a DARPA-provided Boston Dynamics Atlas robot; teams using Atlas had only a few short weeks to go from a purely software simulation to working with a real-world robot. In third place was Carnegie Mellon University's Team Tartan Rescue and their CHIMP robot with 18 points.
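The arithmetic behind that 32-point maximum is easy to check: eight events, each worth three task points plus one autonomy bonus. A quick sketch (event names and scores as reported above):

```python
EVENTS = ["Door", "Debris", "Valve", "Wall", "Hose", "Terrain", "Ladder", "Vehicle"]
POINTS_PER_EVENT = 3 + 1  # three sub-tasks at one point each, plus one bonus point

MAX_SCORE = len(EVENTS) * POINTS_PER_EVENT  # 8 events * 4 points = 32

# Top three finishers and their totals.
results = {"Schaft": 27, "IHMC": 20, "Tartan Rescue": 18}
for team, points in results.items():
    print(f"{team}: {points}/{MAX_SCORE} ({100 * points / MAX_SCORE:.1f}%)")
```

Even the winning run left five points on the table, which gives a sense of how hard these tasks still are.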
The expo portion of the challenge was also exciting, with first responders and robotics researchers working together to understand the problems robots will face in real-world disaster situations. Google's recent acquisition, Boston Dynamics, was also on hand, running their WildCat and LS3 robots. The only real downside to the competition was the coverage provided by DARPA. The live stream left quite a bit to be desired, and the majority of videos on DARPA's YouTube channel currently consist of 9-10 hour recordings of some of the event cameras. The wrap-up videos also contain very little information on how the robots actually performed during the trials. Hopefully more information and video will come out over the next few days. For now, please share the timestamp and a description of your favorite part in the comments.
At Hackaday we don't often feature Kickstarter campaigns, but in our opinion this one is worth noting. It is called Pixy, a small camera board about half the size of a business card that can detect objects you "train" it to recognize.
Training is accomplished by holding the object in front of Pixy’s lens and pressing a button. Pixy then finds objects with similar color signatures using a dedicated dual-core processor that can process images at 50 frames per second. Pixy can report its findings, which include the sizes and locations of all detected objects, through one of several interfaces: UART serial, SPI, I2C, digital or analog I/O.
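Over the serial interfaces, detected objects arrive as a stream of small fixed-size blocks. The sketch below is a hedged host-side illustration: the field layout assumed here (a 0xaa55 sync word, then checksum, signature, x, y, width and height as little-endian 16-bit words) is our reading of the published CMUcam5 protocol, so verify it against Pixy's own documentation before relying on it.

```python
import struct

SYNC = 0xaa55  # object-block sync word (assumed, per the CMUcam5 protocol as we read it)

def parse_block(raw):
    """Parse one 14-byte object block into a dict of object properties."""
    sync, checksum, sig, x, y, w, h = struct.unpack("<7H", raw)
    if sync != SYNC:
        raise ValueError("lost sync")
    if (sig + x + y + w + h) & 0xFFFF != checksum:
        raise ValueError("bad checksum")
    return {"signature": sig, "x": x, "y": y, "width": w, "height": h}

# Round-trip example with a synthetic block: signature 1 at (160, 100), 20x10 px.
fields = (1, 160, 100, 20, 10)
block = struct.pack("<7H", SYNC, sum(fields) & 0xFFFF, *fields)
obj = parse_block(block)
```

The checksum field is what lets a host resynchronize on a noisy UART link: a block that fails the check is discarded and the parser scans forward for the next sync word.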
The platform is open hardware and its firmware is open source (GPL licensed), which makes the project very interesting. It is based on a 204 MHz dual-core ARM chip (Cortex-M4 and Cortex-M0), uses a 1280×800 image sensor, and can stream the processed camera output to your computer. You can get one Pixy through the Kickstarter campaign for $59, which is not that expensive for what it does.