Most people enjoy at least a modicum of privacy. Aside from the data we all share willingly on the web in the form of forum posts, Twitter activity, and the like, people generally like keeping to themselves.
What would you think then, if you found out your iPhone (or any iDevice with 3G) was tracking and logging your every movement?
That’s exactly what two researchers from the UK are claiming. They state that the phone constantly logs your location using cell towers, storing the information in a timestamped database. That database is not encrypted and is copied to your computer each time you sync with iTunes. Additionally, the database is copied back to your new phone should you ever replace your handset.
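What makes this sting is that the file is reportedly just a plain SQLite database, so anyone with a few lines of script can read it. Here’s a minimal sketch of what pulling the log might look like – the table and column names (CellLocation, Timestamp, and friends) and the Mac-epoch timestamp base are assumptions pieced together from the researchers’ description, so we build a tiny stand-in database rather than pretend to have the real file:

```python
import sqlite3
from datetime import datetime, timedelta

# Stand-in for the reported "consolidated.db" – this schema is an assumption.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE CellLocation (
    MCC INT, MNC INT, LAC INT, CI INT,
    Timestamp REAL, Latitude REAL, Longitude REAL)""")
db.execute("INSERT INTO CellLocation VALUES "
           "(310, 410, 1234, 5678, 324000000.0, 40.7128, -74.0060)")

# Apple timestamps reportedly count seconds from 2001-01-01 (the Mac epoch).
MAC_EPOCH = datetime(2001, 1, 1)

rows = db.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation ORDER BY Timestamp"
).fetchall()
for ts, lat, lon in rows:
    when = MAC_EPOCH + timedelta(seconds=ts)
    print(f"{when:%Y-%m-%d %H:%M} -> ({lat:.4f}, {lon:.4f})")
    # prints: 2011-04-09 00:00 -> (40.7128, -74.0060)
```

Point the connection at a copy pulled from a synced backup instead of `:memory:` and the same query would apply – assuming, of course, the schema really does look like this.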
We understand that many iPhone apps use location awareness to enhance the user experience, and law enforcement officials should be able to pull data from your phone if necessary – we’re totally cool with that. However, when everywhere you have been is secretly logged in plaintext without any sort of notification, we get a bit wary. At the very least, Apple should consider encrypting the file.
While this data is not quite as sensitive as, say, your Social Security number or bank passwords, it is dangerous in the wrong hands just the same. Even a moderately skilled thief, upon finding or swiping an iPhone, could easily dump the contents and have a robust dataset showing where you live and when you leave – all the makings of a perfect home invasion.
Continue reading to see a fairly long video of the two researchers discussing their findings.
[Image courtesy of Engadget]
Continue reading “iPhone watching every breath you take, every move you make”
Earlier this week, we came across a video of an orb-based eyeball that would follow you throughout the room, based on data gathered from a Kinect sensor. Try as we might, we couldn’t find much more than the video, but it seems that the guys behind the project have spoken up in a recent blog post.
[Jon George] of The Design Studio UK explained that the person-tracking eyeball visualization was built using a PC, a Kinect, and a product called the Puffersphere, which projects a 360-degree image on the inside of a glass orb. A panoramic image is converted for use by the special lens inside the sphere by applying a filter that warps the image into a circular shape.
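The warp itself is straightforward to sketch: sweep the panorama’s horizontal axis around the disc as an angle and map its vertical axis to radius. A rough numpy version is below, purely for illustration – the real Puffersphere filter surely accounts for the lens’s actual projection profile, which we don’t know:

```python
import numpy as np

def pano_to_polar(pano, size):
    """Warp an equirectangular panorama (H x W) into a circular image
    (size x size): angle around the disc picks the panorama column,
    radius from the centre picks the panorama row."""
    h, w = pano.shape
    ys, xs = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    dx, dy = xs - cx, ys - cy
    r = np.sqrt(dx * dx + dy * dy) / cx                  # 0 at centre, 1 at rim
    theta = (np.arctan2(dy, dx) + np.pi) / (2 * np.pi)   # 0..1 around the disc
    px = np.clip((theta * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((r * (h - 1)).astype(int), 0, h - 1)
    out = pano[py, px]
    out[r > 1.0] = 0                                     # black outside the disc
    return out

# Synthetic panorama: brightness ramps from 0 at the top to 255 at the bottom,
# so the warped disc should be dark in the middle and bright at the rim.
pano = np.tile(np.linspace(0, 255, 90)[:, None], (1, 360))
disc = pano_to_polar(pano, 128)
```

Feed it a real panorama instead of the gradient and you get the circular source image the sphere’s optics expect.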
After the image has been created, a simple Windows app, used in conjunction with the OpenNI framework, allows the image to follow you around the room.
The only problem with this fun little project is the price of the sphere – we’re not sure what it is exactly, but rest assured it is more than we are willing to pay for such a toy. We’re thinking there has to be a way to simulate the orb’s effect to some degree using cheaper hardware. It’s possible that it could be done using a small-scale DIY version of this spherical mirror projection build, though it consists of concave half-spheres rather than full orbs.
In the meantime, take a look at these two videos of the orb in action. Don’t worry – we know you were totally thinking about the Eye of Sauron, so the second video should not disappoint.
Continue reading “People-tracking orb demo makes us want to build our own”
F.A.T. took it to the next level, combining a couple of their projects for the Cinekid festival. This contraption lets kids write their names with their eyes for printing by a robot arm. The first part is a glasses-free version of the EyeWriter, originally developed as an assistive technology. The system uses some IR LEDs to generate a reflection on your eye that a PS3 camera can pick up and use to precisely track your gaze. Just look at each key on a virtual keyboard to spell out your message. From there, a robot arm used previously in the Robotagger project prints out the name on a big sheet of paper the kids can take home. This is cool, but more importantly it’s a great way to inspire the next generation of hackers and engineers. Check out the video after the break.
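The typing half of the system boils down to dwell selection: the tracker hands you a stream of gaze coordinates, and holding your gaze inside one key’s region long enough “presses” it. A toy sketch of that idea – the grid layout, pixel coordinates, and dwell threshold here are illustrative guesses, not the EyeWriter’s actual code:

```python
# Dwell-based gaze typing, boiled down. All numbers are made up for the demo.
KEYS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX"]
KEY_W, KEY_H = 100, 50      # on-screen size of each virtual key, in pixels
DWELL_FRAMES = 30           # hold a key for ~1 s at 30 fps to "press" it

def key_at(x, y):
    row, col = int(y // KEY_H), int(x // KEY_W)
    if 0 <= row < len(KEYS) and 0 <= col < len(KEYS[0]):
        return KEYS[row][col]
    return None             # gaze is off the keyboard

def type_from_gaze(points):
    typed, current, held = [], None, 0
    for x, y in points:
        k = key_at(x, y)
        if k == current:
            held += 1
            if held == DWELL_FRAMES and k is not None:
                typed.append(k)         # fires exactly once per dwell
        else:
            current, held = k, 1        # gaze moved to a new key
    return "".join(typed)

# 30 frames staring at 'H', then 30 frames at 'I':
gaze = [(150, 75)] * 30 + [(250, 75)] * 30
print(type_from_gaze(gaze))  # prints HI
```

From there it’s just a matter of handing the finished string to the plotter.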
Continue reading “Kids type with their eyes, robot arm prints their words”
This robot eye can move five times faster than the human eye. It can be used to follow a human gaze and, as you can see by that coin, it’s small enough to be used in pairs. When used to follow your gaze it needs a custom-made eye tracker. The thought here is that a lot can be learned about a person’s psyche by monitoring what they are focusing on. But we wonder about the augmented reality possibilities of a setup like this.
Imagine a pair of glasses as a heads up display. If this camera knows where you’re looking it can process the items in your gaze and overlay digital information. As with all new technology there are obvious military uses for this, but we’d be more interested in a Flickr pool type collection of people’s real-world experiences. Like subscribing to the locations of that thumb drive network in NYC and having the camera/glasses guide you to the nearest installation.
Want to see how fast this thing responds? Check out the video after the break.
Continue reading “Robot eyes look where you do”
The EyeSeeCam is a rig that attaches to your noggin and points a camera wherever your gaze falls. There are actually four cameras involved here: one to track each eye via a reflective piece of acrylic, one as your third eye, and finally the tracking camera above that. There are some legitimate medical uses for this type of technology, but we enjoyed seeing some of the videos that [Johannes Vockeroth] put together showing everyday activities. We’ve embedded several clips after the break including an example of reading a book while wearing the apparatus. The third eye camera provides the wide shot with close-ups of the wearer’s visual focus.
Continue reading “Head mounted camera tracks with your eyes”
As we all know, a solar panel must be exposed to as much direct sunlight as possible to reach full efficiency. A fixed mount limits the amount of time that the panel is fully exposed to direct sunlight. The solution is to build a pivoting mount that automates the process of aiming at the sun.
[bwitmer] takes us through the process of building one out of some wood and old bicycle rims. He bought a pre-made tracking unit to control his actuator, but we think many of you here could rig something up on your own.
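The classic DIY approach is two light sensors straddling the panel and a control loop that steps toward whichever side reads brighter. Here’s the logic in a few lines of Python – sensor values, step size, and deadband are made up for the demo, and on real hardware this would live on a microcontroller driving the actuator:

```python
DEADBAND = 5   # ignore small imbalances so the mount doesn't hunt back and forth

def track_step(east, west, angle, step=1):
    """One control-loop pass: nudge the panel toward the brighter sensor."""
    if east - west > DEADBAND:
        return angle - step        # east sensor brighter: swing east
    if west - east > DEADBAND:
        return angle + step        # west sensor brighter: swing west
    return angle                   # balanced: hold position

# Simulated morning: the east sensor reads brighter until the panel catches up,
# then the two readings balance out and the mount holds still.
angle = 0
readings = [(800, 700)] * 10 + [(750, 748)] * 5
for east, west in readings:
    angle = track_step(east, west, angle)
print(angle)  # prints -10
```

The deadband is the important bit: without it, sensor noise would have the actuator twitching constantly.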
Elaborating on an item previously mentioned among last weekend’s Cornell final projects list, this time with video:
For their ECE final project, [Adam Papamarcos] and [Kerran Flanagan] implemented a real-time video object tracking system centered around an ATmega644 8-bit microcontroller. Their board ingests an NTSC video camera feed, samples frames at a coarse 39×60 pixel resolution (sufficient for simple games), processes the input to recognize objects and then drives a TV output using the OSD display chip from a video camera (this chip also recognizes the horizontal and vertical sync pulses from the input video signal, which the CPU uses to synchronize the digitizing step). Pretty amazing work all around.
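At that resolution, the “recognize objects” step can be as simple as thresholding the frame and taking a centroid. Here’s a quick Python sketch of the idea – the actual firmware does its work in C on the ATmega644 between sync pulses, and we’re only guessing at its exact approach:

```python
import numpy as np

H, W = 39, 60                       # the coarse sampling grid from the writeup
frame = np.zeros((H, W), dtype=np.uint8)
frame[10:15, 20:26] = 200           # a fake bright "object" against black

def find_object(frame, threshold=128):
    """Threshold the frame and return the bright blob's centroid (row, col)."""
    ys, xs = np.nonzero(frame > threshold)
    if len(xs) == 0:
        return None                 # nothing bright enough in view
    return (int(ys.mean()), int(xs.mean()))

print(find_object(frame))  # prints (12, 22)
```

Run that once per field and you have a position you can feed straight into game logic – which is presumably why 39×60 pixels is plenty for Tetris.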
Sometimes clever projects online are scant on information, but as this is their final grade, they’ve left no detail to speculation. Along with a great explanation of the system and its specific challenges, there’s complete source code, schematics, a parts list – the whole nine yards. Come on, guys! You’re making the rest of us look bad… Videos after the break…
Continue reading “Human Tetris: object tracking on an 8-bit microcontroller”