Detect Cosmic Rays With Your Smartphone Using CRAYFIS

Representation of a cosmic ray hitting a smartphone's screen.

[Daniel Whiteson and Michael Mulhearn], researchers at the University of California, have come up with a novel method of detecting ultra-high energy cosmic rays (UHECR) using smartphones. UHECR are defined as having energy greater than 10^18 eV. They are rare and very difficult to detect with current arrays. Examining enough air showers to catch UHECR requires more surface area, and current arrays, like the Pierre Auger Observatory and AGASA, cannot get much larger without dramatically increasing cost. A similar THP Quarterfinalist project is the construction of a low-cost cosmic ray observatory, which also noted that more detection area is needed to obtain enough useful data.

[Daniel Whiteson and Michael Mulhearn] and colleagues noted that smartphone cameras with CMOS sensors can detect ionizing radiation, which means they will also pick up muons and high-energy photons from cosmic ray air showers. The ubiquitous presence of smartphones makes their collective detection of air showers and UHECR an intriguing possibility. To make all this happen, [Whiteson and Mulhearn] created a smartphone app called CRAYFIS, short for Cosmic RAYs Found In Smartphones. The app turns an idle smartphone into a cosmic ray detector: when the screen goes to sleep and the camera is face-down, CRAYFIS starts taking data from the camera. If a particle from a cosmic ray shower hits the CMOS sensor, the image data is stored on the smartphone along with the arrival time and the phone’s geolocation, and this information is uploaded to a central server via the phone’s WiFi. The user does not have to interact with the app beyond installing it, and CRAYFIS only captures data while the phone is plugged in, so there are no worries about dead batteries.
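As a rough sketch of the kind of processing described above (keep only frames that contain above-threshold pixels, tag them with arrival time and location, and queue them for upload), the logic might look something like the following. It is written in Python for readability; the function names, threshold value, and data layout are illustrative, not taken from the actual CRAYFIS app.

import time

PIXEL_THRESHOLD = 30  # illustrative brightness cutoff for a "hit" (0-255 scale)

def find_hits(frame):
    """Return (x, y, value) for every pixel above the threshold."""
    hits = []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > PIXEL_THRESHOLD:
                hits.append((x, y, value))
    return hits

def process_frame(frame, latitude, longitude):
    """Package above-threshold pixels with arrival time and geolocation."""
    hits = find_hits(frame)
    if not hits:
        return None  # the vast majority of frames are empty and are discarded
    return {
        "timestamp": time.time(),
        "location": (latitude, longitude),
        "hits": hits,  # only the interesting pixels are kept, not the full frame
    }

# Candidate events would then be uploaded over WiFi (e.g. an HTTP POST to the
# project's server) whenever the phone is idle and plugged in.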

The goal of CRAYFIS is to have a minimum of one million smartphones running the app, with a density of 1000 smartphones per square kilometer. As an incentive, anyone whose smartphone data is used in a future scientific paper will be listed as an author. According to the site, there are CRAYFIS app versions for both Android and iOS. CRAYFIS is still in beta, so the apps aren’t publicly available yet. Head over to the site to join up!

[via Science]

6 thoughts on “Detect Cosmic Rays With Your Smartphone Using CRAYFIS”

  1. Which is the important part of UHECR detection?
    The global distribution of the sensor elements.
    The surface area (about 20 square meters)
    The number of individual sensor elements (5×10^12 pixels)

    How I guesstimated my numbers:
    A pixel in a CMOS sensor is typically on the order of 2 micrometers, give or take.
    A typical phone camera these days is maybe about 5 megapixels (so roughly 4.5 mm x 4.5 mm).
    So a million phones would have a total sensor area of about 20 square meters (5×10^12 pixels).
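
    Spelled out as a quick calculation (same rough assumptions as above, treated as round numbers rather than measured values):

    pixel_pitch_m = 2e-6          # ~2 micrometer pixels
    pixels_per_phone = 5e6        # ~5 megapixel camera
    phones = 1e6                  # target number of phones

    sensor_area_per_phone = pixels_per_phone * pixel_pitch_m**2   # 2e-5 m^2, i.e. ~20 mm^2
    total_area = phones * sensor_area_per_phone                   # ~20 m^2
    total_pixels = phones * pixels_per_phone                      # 5e12 pixels

    print(f"per-phone sensor area: {sensor_area_per_phone * 1e6:.0f} mm^2")
    print(f"total sensor area:     {total_area:.0f} m^2")
    print(f"total pixels:          {total_pixels:.0e}")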

    1. The physical sensor area isn’t the same as the effective area of the detector. For Auger, for instance, the physical sensor area is only 16,000 m^2, or 0.016 km^2. The detector has 3000 km^2 area. You can do this because an individual cosmic ray has a footprint on the ground that’s kilometer-sized, and you don’t need to sample the particle front everywhere on the ground.

      It’s a neat idea to get people interested, but to detect anything you need all of those phones stationary, taking data, in a contiguous area (a lone phone by itself doesn’t do anything). I’d also imagine they didn’t account for the phones being distributed in 3D: in the places with the highest person density (cities), people are spread out in 3D, not 2D.

      That being said, given how cheaply you can get smartphones with a camera, it’d be interesting to see what you could do with 100 cheap smartphones (~$1K). That’d be a really interesting project for a high school, for instance.
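
      Putting rough numbers on that comparison (Auger figures as quoted above; the per-phone sensor area is the ~20 mm^2 back-of-envelope estimate from the earlier comment):

      physical_sensor_area_m2 = 16_000          # Auger's physical sensor area, roughly
      effective_area_m2 = 3_000 * 1e6           # 3000 km^2 of effective detector area

      fill_factor = physical_sensor_area_m2 / effective_area_m2
      print(f"Auger sensor fill factor: {fill_factor:.1e}")   # ~5e-6

      # The CRAYFIS target of 1000 phones per km^2, at roughly 20 mm^2 of sensor
      # per phone, gives an even sparser sampling of each shower footprint:
      phone_fill_factor = 1000 * 20e-6 / 1e6
      print(f"phone sensor fill factor: {phone_fill_factor:.1e}")   # ~2e-8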

    1. According to the paper, it’s only storing frames which contain above-threshold pixels, and of those frames, they’re only storing the pixels they’re interested in. Your phone gets hot when recording video because of the CPU/GPU load for encoding and compressing video, which is quite intensive. If the frame processing takes less CPU time than video compression, it may run cooler.

      On the other hand, a million smartphones probably aren’t much worse for power consumption than an equivalently sized array of cooled sensors and the computers supporting them, plus the gas burned by the field techs going out to service all of them. The smartphones have the advantage of already being hauled around by volunteers who don’t require compensation for their time and are happy to supply the electricity in exchange for participating in science. (Think SETI@home or Folding@home.)

  2. Sounds good. Wasn’t there an article about using soft errors in RAM (detectable using ECC) to sense cosmic rays?
    Plus RAM is encapsulated, so a cosmic strike will be detected 100% of the time by sampling the ECC code and looking for telltale patterns as the ray passes through the multiple layers of memory.
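
    For anyone curious, here is a minimal sketch of watching for corrected ECC errors on a Linux machine with EDAC support. The EDAC sysfs counters only exist when an EDAC driver for the memory controller is loaded, and an increment only tells you a bit flipped and was corrected, not that a cosmic ray caused it.

    import glob
    import time

    def read_ce_counts():
        """Sum corrected-error counters across all reported memory controllers."""
        total = 0
        for path in glob.glob("/sys/devices/system/edac/mc/mc*/ce_count"):
            with open(path) as f:
                total += int(f.read().strip())
        return total

    baseline = read_ce_counts()
    while True:
        time.sleep(60)
        current = read_ce_counts()
        if current > baseline:
            print(f"{current - baseline} new corrected memory error(s)")
            baseline = current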
