NASA would like your help exploring not space but the bottom of the ocean. For now, you’ll need an Apple device, although an Android version is in the works. While it might seem strange for the space agency to look underwater, the images it needs to process come from fluid-lensing cameras, which use techniques originally developed to remove atmospheric distortion from pictures of outer space. Turns out the same techniques can also unravel distortion caused by the ocean’s surface and clearly image coral reefs.
The phone app takes the form of a game that, according to NASA, even a first grader could play. In the game, you are in command of an ocean research vessel, the Nautilus. You dive to examine coral and identify what you see. The game generates training data for a supercomputer at the Ames Research Center so it can learn to recognize coral types even in images taken with more conventional cameras.
Coral is a fascinating life form, and it is under threat from a variety of sources. The supercomputer will take data from Puerto Rico, Guam, and American Samoa and learn how to do the classification. Then it will apply that learning to other photographs and build a worldwide map of coral.
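For the curious, here’s a minimal sketch of the kind of supervised learning this pipeline implies: each player-labeled image patch is paired with a coral class, and a small convolutional network learns the mapping. Everything below (the class names, patch size, and network layout) is our own illustrative assumption in PyTorch, not NASA’s actual code.

```python
import torch
import torch.nn as nn

CLASSES = ["branching", "mounding", "rock", "sand"]  # hypothetical label set

class CoralNet(nn.Module):
    """Tiny CNN that maps an RGB patch to a coral class."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = CoralNet(len(CLASSES))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for a batch of player-labeled patches: 8 RGB images, 64x64 px.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(len(CLASSES), (8,))

for epoch in range(3):  # real training would loop over a full dataset
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

# Once trained, the same model classifies patches from new surveys.
with torch.no_grad():
    pred = model(images[:1]).argmax(dim=1)
    print(CLASSES[pred.item()])
```

Swap the random tensors for real labeled patches, and that same forward pass run over imagery from new surveys is what rolls per-patch predictions up into a map.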
The game allows you to earn badges and access video content about life under the sea. The data, by the way, didn’t come from space, but from cameras on drones or conventional aircraft.
This isn’t the first time we’ve seen crowdsourcing for scientific research. If you want to grow your own coral, that’s a hardware project.
“Click on all of the pictures with coral in them.”… NASA gets into the CAPTCHA arena. :^)
I’m getting worried about those CAPTCHAs if they’re gonna be used for training self-driving cars etc. They’ve now trained humans to ignore corners and stray parts of objects in order to pass the CAPTCHA on the first go, i.e. if something isn’t at least 10% of the square, ignore it. So self-driving cars are gonna almost make it past things, if it weren’t for those pesky last few inches they get hung up on.