Official NFL footballs are crafted by hand by Wilson Sporting Goods, a company headquartered in Chicago. The footballs typically range from 11 to 11.5 inches in length and weigh between 14 and 15 ounces. Originally, footballs were made from inflated animal bladders, occasionally that of a pig, which earned the traditional American football its long-standing nickname of a “pigskin.” Nowadays, they consist of cowhide leather or vulcanized rubber, with laces stitched to the top that add mass and leave the oblong spheres naturally lopsided. This is corrected by adding extra weight to the opposite side of the football to balance it out. Knowing this, a clever hacker will realize that the balancing spot is a perfect place to subtly add a motion tracking transmitter like this one. Doing so makes it possible to track not only the position of the ball on the field, but its precise location in 3D space!
Since each football is made by hand, variations from one ball to another already exist. This means that embedding a circuit into a football only modifies the equipment slightly, which is a good thing because sports fanatics tend to be very opinionated about whether or not technology should influence the game. So long as the transmitter and loop antenna added to the air bladder don’t push past that threshold of about an ounce (or so) of added weight, the football itself really isn’t affected much.
Continue reading “Tracking Footballs with Magnetic Fields”
A few folks over at Carnegie Mellon have come up with a very simple way to do high-speed motion tracking (PDF) with little more than a flashlight. It’s called Lumitrack, and while it looks like a Wiimote on the surface, it is in reality much more accurate and precise.
The system works by projecting structured light onto two linear optical sensors. The pattern of the light is an m-sequence – basically a barcode in which every fixed-length window of the sequence is unique. By shining this light onto a linear sensor, Lumitrack can calculate where the light is coming from, and thus the position of whatever is holding the light.
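To make that window-uniqueness property concrete, here is a small sketch (not code from the paper; the tap positions and sequence length are illustrative choices) that generates a binary m-sequence with a linear-feedback shift register and confirms that every fixed-length window pins down a unique position in the pattern:

```python
# Sketch: generate a binary m-sequence with a Fibonacci LFSR and verify
# the windowed-uniqueness property Lumitrack relies on. Taps at bits 5
# and 2 give a maximal-length sequence with period 2^5 - 1 = 31.

def m_sequence(degree=5, taps=(5, 2)):
    state = [1] * degree              # any nonzero seed works
    seq = []
    for _ in range(2**degree - 1):
        seq.append(state[-1])         # output the last bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]        # XOR the tapped bits
        state = [fb] + state[:-1]     # shift the feedback bit in
    return seq

seq = m_sequence()
n = len(seq)                          # 31 bits for degree 5
# every length-5 window (taken cyclically) is distinct
windows = {tuple((seq + seq)[i:i + 5]) for i in range(n)}
print(len(windows) == n)              # True
```

Because each 5-bit window appears exactly once per period, any window the linear sensor reads tells the system exactly which slice of the projected pattern it is looking at, and therefore where the light source sits.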
Even though the entire system consists of only an ARM microcontroller (in the form of a Maple Mini board), two linear optical sensors, and a flashlight with an m-sequence gel, it’s very accurate and very, very fast. The team is able to read the position at over 1000 frames/second, nearly the limit of what can be done with the Maple’s serial connection.
Already there are some interesting applications for this system – game controllers, including swords, flight yokes, and toy cars, and also more artistic endeavors such as a virtual can of spray paint. It’s an interesting piece of tech, and with the right parts, something any of us can build at home.
You can see the Lumitrack demo video below.
Continue reading “Extremely Precise Positional Tracking”
This guy takes a drink and so does the virtual wooden mannequin. Well, its arm takes a drink because that’s all the researchers implemented during this summer project. But the demo really makes us think that suits full of IMU boards are the next generation of motion capture. Not because this is the first time we’ve seen it (the idea has been floating around for a couple of years) but because the sensor chips have gained incredible precision while dropping to bargain basement prices. We can pretty much thank the smartphone industry for that, right?
Check out the test subject’s wrist. That’s an elastic bandage which holds the board in place. There’s another one on his upper arm that is obscured by his shirt sleeve. The two of these are enough to provide accurate position feedback in order to make the virtual model move. In this case the sensor data is streamed to a computer over Bluetooth where a Processing script maps it to the virtual model. But we’ve seen similar 9-axis sensors in projects like this BeagleBone sensor cape. It makes us think it would be easy to have an embedded system like that on the back of a suit which collects data from sensor boards all over the test subject’s body.
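As a rough illustration of the kind of work the receiving script has to do (the function names and sample format here are our assumptions, not the project's actual code), a complementary filter is one common way to fuse fast-but-drifty gyro integration with noisy-but-absolute accelerometer tilt into a stable limb angle:

```python
# Minimal sensor-fusion sketch: estimate a limb segment's pitch from a
# stream of 9-axis IMU samples. Assumes gyro in rad/s and accel in g's
# at a fixed sample rate; these conventions are illustrative.

import math

def complementary_pitch(samples, dt=0.01, alpha=0.98):
    """samples: iterable of (gyro_y_rad_s, accel_x_g, accel_z_g)."""
    pitch = 0.0
    for gy, ax, az in samples:
        gyro_pitch = pitch + gy * dt         # integrate angular rate
        accel_pitch = math.atan2(ax, az)     # tilt from the gravity vector
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
    return pitch

# A stationary arm held level: gyro reads ~0, gravity along +z
level = [(0.0, 0.0, 1.0)] * 500
print(round(complementary_pitch(level), 3))  # → 0.0
```

The `alpha` blend is the whole trick: closer to 1 trusts the gyro (smooth but drifting), closer to 0 trusts the accelerometer (absolute but jittery). A full mocap suit would run one such estimator per sensor board and stitch the angles onto the skeleton.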
Oh who are we kidding? [James Cameron’s] probably already been using this for years.
Continue reading “IMU boards as next-gen motion capture suit?”
Today we have a special treat, three projects combining the “fastest selling consumer electronics device”, Kinect, and the “fastest selling indie Java game that once kept us from sleeping for an entire weekend”, Minecraft!
[Sean Oczkowski] writes in to tell us about his efforts to play Minecraft with Kinect using no more than the OpenKinect Java wrapper on Ubuntu. The code was written in about 4 days with some help from Wikipedia. Using histograms to locate the player in the field of view, the script calculates the center of mass of the body and defines interactions for the limb occupying that quadrant of the screen. [Sean] does an excellent job of running through the whole process as well as the decisions made along the way. The whole thing is a bit like running in place, and we can’t imagine the flailing that will occur during the inevitable creeper encounter.
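The body-tracking step boils down to some very approachable math. Here is a hedged sketch of the idea (pure-Python illustration under our own assumptions, not [Sean]'s actual Java): find the center of mass of the player's foreground mask, then classify a limb pixel by which screen quadrant it occupies relative to that center:

```python
# Given a binary foreground mask from the depth camera (player pixels = 1),
# compute the body's center of mass, then bucket a limb position into one
# of the four quadrants around it to trigger game interactions.

def center_of_mass(mask):
    """mask: 2D list of 0/1 values (rows of the depth frame)."""
    total = sx = sy = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                total += 1
                sx += x
                sy += y
    return (sx / total, sy / total)

def quadrant(point, center):
    px, py = point
    cx, cy = center
    horiz = "left" if px < cx else "right"
    vert = "upper" if py < cy else "lower"
    return f"{vert}-{horiz}"

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
c = center_of_mass(mask)
print(c)                       # → (1.5, 1.0)
print(quadrant((0, 0), c))     # → upper-left
```

Map each quadrant to an in-game action (mine, place, turn, walk) and you have a controller, flailing included.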
Next we have [Wade McGillis] with his award-winning Minecraft Kinect Controller. [Wade] provides source code and executables at his site. This version of control uses skeletal tracking data to sense the user’s gestures. This still involves holding your hands out like a zombie, but it is a bit more versatile, as one can pass their arms in front of their own body.
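Skeletal tracking hands you labeled joint coordinates instead of a raw silhouette, so gesture sensing becomes a matter of comparing joints. A toy sketch (joint names, coordinate conventions, and thresholds are all our invention, not [Wade]'s code) of what that looks like:

```python
# Classify gestures from skeletal-tracking joints. Coordinates are
# normalized screen positions with y growing downward, a common but
# here purely illustrative convention.

def gesture(skeleton):
    """skeleton: dict of joint name -> (x, y)."""
    actions = []
    if skeleton["right_hand"][1] < skeleton["head"][1]:
        actions.append("jump")            # hand raised above the head
    if skeleton["left_hand"][0] > skeleton["torso"][0] + 0.3:
        actions.append("place_block")     # arm swept across the body
    return actions

pose = {"head": (0.5, 0.2), "torso": (0.5, 0.5),
        "right_hand": (0.7, 0.1), "left_hand": (0.9, 0.5)}
print(gesture(pose))  # → ['jump', 'place_block']
```

Because the joints are tracked individually, an arm crossing in front of the torso stays distinguishable, which is exactly the versatility the silhouette-based approach above lacks.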
Finally [Nathan Viniconis] has been doing some very interesting work using the Kinect to import giant three-dimensional models into the game world. [Nathan] then goes the extra mile and animates the figures! Check out the video below for the really impressive results. We here at Hackaday feel that this is the most appropriate use of this technology, and may begin building gigantic statues of ourselves on public servers.
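The core of that import step is voxelization: snapping a Kinect point cloud onto Minecraft's block grid. A minimal sketch, assuming metric point coordinates and a made-up scale factor (nothing here is from [Nathan]'s code):

```python
# Quantize 3D points (meters) onto a block grid. block_size sets how
# many meters of real space one block represents; the set removes
# duplicate points that land in the same block.

def voxelize(points, block_size=0.25):
    return {tuple(int(c // block_size) for c in p) for p in points}

cloud = [(0.10, 0.30, 0.05), (0.12, 0.31, 0.06), (0.60, 0.30, 0.05)]
blocks = voxelize(cloud)
print(sorted(blocks))  # → [(0, 1, 0), (2, 1, 0)]
```

Animation then amounts to re-voxelizing each captured frame and diffing against the previous block set, placing and removing only the blocks that changed.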
Check out the tricrafta (minefecta?) of videos after the jump!
Continue reading “Kinect + Minecraft Trifecta”
[Petar and Sylvain] are teaching this robot to flip pancakes. It starts with some kinesthetic learning; a human operator moves the robot arm to flip a pancake while the robot records the motion. Next, motion tracking is used so that the robot can improve during its learning process. It eventually gets the hang of it, as you can see after the break, but we wonder how this will work with real batter. This is a simulated pancake, so the weight and the amount of force necessary to unstick it from the pan are always the same. Still, we loved the robotic pizza maker and if they get this to work it’ll earn a special place in our hearts.
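The kinesthetic-teaching step is conceptually simple: sample the joint encoders while the human drives the arm, then play the trajectory back as position targets. A bare-bones sketch (every name here is a stand-in for the real robot interface, which we haven't seen):

```python
# Record-and-replay skeleton of kinesthetic teaching: poll joint angles
# during the human demonstration, then feed them back as targets.

def record_demo(read_joints, n_samples=200):
    """Poll the arm's joint encoders while the operator moves it."""
    return [read_joints() for _ in range(n_samples)]

def replay(trajectory, send_targets):
    for pose in trajectory:
        send_targets(pose)

# Simulated stand-ins for the real encoder and motor interfaces
demo_poses = iter([(0.0, 0.1), (0.1, 0.3), (0.2, 0.5)])
recorded = record_demo(lambda: next(demo_poses), n_samples=3)
sent = []
replay(recorded, sent.append)
print(sent == recorded)  # → True
```

The learning part is everything this sketch leaves out: blind replay reproduces one demonstration exactly, while the motion-tracking feedback is what lets the robot refine the flip beyond what it was shown.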
Continue reading “Flipping pancakes”