Here’s a hypothesis for you: radioactive decay varies over time, possibly with a yearly cycle. [Panteltje] decided to test this hypothesis, and so far has two years’ worth of data to comb over.
Radioactive decay can be easily detected with a photomultiplier tube, but these tubes are sensitive to magnetic fields and cosmic rays that would easily fly through just about any shielding [Panteltje] could come up with. Instead, the radiation in this setup is detected with simple photo detectors, pressed right up against a tritium-filled glass ampoule, a somewhat common lighting solution for fishing lures, watch faces, and compasses.
The experimental setup records the photo detectors, a temperature sensor, and a voltage reference, recording all the data to an EEPROM once an hour. All the important electronics are stuffed into a heatsinked, insulated, light-proof box, while the control electronics reside on a larger board with battery backup, alarm, indicator LEDs, and an RS232 connection.
After one year, [Panteltje] recorded the data and reset the experiment for another year. There are now two years’ worth of data available, ready for anyone to analyze. Of course, evidence that radioactive decay changes over the course of a few years would turn just about every scientific discipline on its head, so at the very least [Panteltje] has a great record of the output of tritium lights against the expected half-life.
64 thoughts on “An Experiment To Test Radioactive Decay Varying Over Time”
Wow. Crazy soldering job. At first I thought this was a Hackaday Fail post.
me too :D and then i wondered how it could work
Gravity is already known to affect spacetime – high gravity “slows” time. If radioactive decay rates change by a tiny but measurable amount according to how close we are to the sun, I don’t think it would turn science on its head.
Supposing the slight change in gravity caused by the Earth moving toward the sun does affect time in any measurable way, it would slow all time locally. If your point of reference is on the sun, then yes, radioactive decay would seem to slow down during that period.
However, since the sensor and the sample are in the same physical location, if time – and radioactive decay – slowed down, then the clocks in the sensors would slow down exactly the same amount, so we would never detect a decrease in decay.
Isn’t our standard of time based on the theory that radioactive decay is not variable over time? A whole lotta physical measurements that physical constants are based on are time derived.
This is not correct, the standard of time is actually defined based on the subtle properties of the *electrons* of an atom (cesium, currently), not the nucleus. If any of the cesium atoms were to decay (which is exceedingly unlikely) it would no longer contain the properties measured by an atomic clock and thus would not affect the measurement.
We (as in mankind) have studied radioactive decay for over a century; nowhere in that time was this phenomenon observed. Radioactive decay over time is an exponential function… as stated above it will vary with relativistic time dilation (that has been proven at the LHC), but since both the detector and sample are in the same place moving at the same speed, they both observe the same passage of time, so decay is not variable…
p.s. atomic clocks do not use radioactive decay, but an electronic transition frequency as the standard, from which everything is derived… the fact Caesium-133 is the preferred element has nothing to do with the radioactivity of its isotopes ;-)
Actually, it has been observed numerous times. However, there are always more plausible seasonal effects which could have affected the instrumentation.
RandyKC: As it happens, I’m doing experiments ATM looking for a similar effect.
The problem stems from the difference between reality and the model we use to predict reality. You can use the model to predict an outcome, but unless you actually do the experiment you can’t be sure what the outcome will be.
In the case of this experiment, the decay times for isotopes were measured in the 1950s with sporadic technique (i.e. four hours a day for 30 days, except days when the researcher had to go home early), with equipment questionable by modern standards (i.e. a wooden disk that could be turned to face different isotopes toward the sensor), and it was measuring an inherently random process.
A yearly cycle can clearly be seen in the data from that time. It was noted by the researchers but never explained. If this can be reproduced and systematic errors eliminated, it would be *very* interesting.
Since this was noticed a couple of years ago, experiments have tried unsuccessfully to reproduce the effect. There is probably no real effect there – most likely it’s a systematic error.
…but we do the experiment just to be sure.
You need to watch this video to learn how atomic clocks work:
I been skool’d.
Velocity slows time as well. Given the direction the Sun is traveling, I bet that there are fewer decays when the Earth’s orbital motion is in the same direction the Sun is moving, i.e. we are traveling in a faster frame of reference, than on the “downhill” side of the orbit.
Now off to look at his data.
Pondering this more, not sure if this makes sense. We’re in the same reference frame as the tritium, so it should decay as expected regardless of our point in orbit.
Doesn’t work that way – an observer can never observe the effects of time dilation on themselves, only on other reference frames. You will never see a clock change speed due to time dilation if you’re riding in the same ship, no matter how close to the speed of light you get.
Of course, you will also never see yourself getting any closer to the speed of light. Light will always move away from you at C.
As totally bizarre as it seems, a clock at the front of an accelerating ship will run faster than a clock in the rear – as seen in the ship. Yes. Solid ship, same acceleration, different rate of time. A clock dragged along on a rope, even slower. Far enough back and time stops. It will twist your melon, but this is what happens when you go beyond the Cosmos and other intro versions of relativity.
Call the cops!
Are there any issues with EEPROM and radiation?
According to Wikipedia, beta decay from tritium is at such low energy levels that it can only go through 6mm of open air, and can’t even penetrate the outer layer of your skin. It probably can’t penetrate through the enclosure into an IC package. Even if it could, there’s probably not enough energy left to affect anything.
The setup does not detect a beta particle but the photon that is released with each decay or interactions between the beta particle and whatever medium the tritium is in.
the vial is most likely coated with a phosphorescent material; this experiment is measuring the rate of phosphorescence decay versus time (which is related to radioactive decay and the degradation of the phosphor compound)
The betas don’t do much, but the emitted X-rays/gammas (bremsstrahlung due to interactions with the vial and enclosure) can definitely affect something like an EEPROM. It only takes a couple of photons here and there…
Perf boards are for wimps!
How can this guy be so crazy to believe his detectors will not change sensitivity?
What do you believe the change would look like? What would make it cycle once per year?
And how did he make sure that the environmental temperature does not affect the output of his sensor + amplification? That can easily cause yearly rhythms.
If you read the article, there is a heating element attached to the tritium tube to keep everything at an approximately constant temperature.
I applaud his will and means, but it might be a good idea to log for another year without the tritium source (maybe substituting a dummy LED source?) to make sure any cyclic effects aren’t originating in the electronics themselves.
LED emission is not constant over time. They give less light as they age.
The experiment =should= already be able to handle a decay in the light source over time. The effect of interest would be any cyclic variance in output.
Radioactive decay varies based on solar flare activity. He’ll need to do the test for more than 11 years. If I remember correctly the decay rate changes BEFORE a solar flare occurs. So he has built a solar flare predictor.
That seemed strange to me, and as far as I can see, the results from that experiment haven’t been reproduced. See e.g. http://arxiv.org/abs/1302.0970 for a try.
That is Carbon-14, and the decay rate stays the same; it is the production rate that changes.
There are only two years’ worth of data, hence only two annual data points, making any conclusion on whether there are annual changes fairly shaky; more data are required. Ideally he should have just left it running continuously for two years, as resetting the electronics affected his data at the beginning of each run. A casual analysis in Excel shows the data follows the characteristic exponential decay, which does not differ significantly from year 1 to year 2.
Interesting idea, extra neutron density when closer to the sun during the summer could in theory increase the decay rate. Probably not by very much considering how tiny the cores of atoms are [1.6–1.7 femtometres in diameter (10^-15 m)] relative to their size [64–450 picometres in diameter (10^−12 m)].
Am working on a similar device using 40K, but mine uses the B+ decay channel.
Fluctuations in proportion of the B- and B+ decays would yield the same count but different spectra, which *could* vary enough to be detected.
I made a graph of the mean light output from the two detectors over the 2 years. http://imgur.com/So6Bhtw Further analysis is clearly needed to extract more useful data though.
Yup, that’s exactly what my analysis came up with. There is no correlation between either of the light channels and reference and temperature (meaning that the changes in either of those values didn’t affect the measurement) and a very strong correlation between light1 and light2. The decay appears very linear, likely because we’re nowhere near the half life in the plot, and doesn’t appear to show any cyclic nature.
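For anyone who wants to repeat that kind of check, here is a minimal sketch on synthetic stand-in data (the real log format and column names aren’t given here, so everything below is illustrative): check the cross-correlations, remove the linear trend, and measure the amplitude of any one-year cycle in the residual.

```python
import numpy as np

# Synthetic stand-in for the hourly log (the real column layout is not
# published here, so this data is purely illustrative).
rng = np.random.default_rng(0)
hours = np.arange(2 * 365 * 24, dtype=float)   # two years, one sample/hour
trend = 1.0 - 1e-6 * hours                     # slow, near-linear dimming
light1 = trend + rng.normal(0, 1e-4, hours.size)
light2 = trend + rng.normal(0, 1e-4, hours.size)
temp = 25 + rng.normal(0, 0.1, hours.size)     # uncorrelated temperature

# The two light channels should track each other, not the temperature.
print(np.corrcoef(light1, light2)[0, 1])       # close to 1
print(np.corrcoef(light1, temp)[0, 1])         # close to 0

# Remove the linear trend, then measure the amplitude at a one-year period.
resid = light1 - np.polyval(np.polyfit(hours, light1, 1), hours)
phase = 2 * np.pi * hours / (365.25 * 24)
amp = 2 / hours.size * np.hypot(resid @ np.sin(phase), resid @ np.cos(phase))
print(f"annual-cycle amplitude: {amp:.1e}")    # noise level: no real cycle
```

On real data you would load the EEPROM dump instead of synthesizing it, but the detrend-then-project step is the same.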
Here’s a link to a background article covering the phenomenon; “The strange case of solar flares and radioactive elements” http://news.stanford.edu/news/2010/august/sun-082310.html.
“Of course, evidence that radioactive decay changes over the course of a few years would turn just about every scientific discipline on its head,…”
Just curious… is Panteltje trying to somehow indirectly show how Dr. Willard Libby’s invention in 1949, Carbon-14 Dating (or Radiocarbon dating), is highly unreliable for multi-million year dating of artifacts such as fossils, etc.? If so, then it’s a waste of time, as Dr. Libby already admitted that his invention is probably only reliable for a few thousand years. And dating beyond that limitation done by radiometric methods is just wishful thinking and basically scientific fallacy. The only reliable dating system IMO is Dendrochronology and ancient eye-witness testimony via genealogical records. Other than that it’s just basic scientific guesswork. We should not be stating results as if they were “cast in stone” (i.e. infallible) and preached to the masses via television and media as if they were some sort of “given” fact when they are obviously not “facts” at all.
Mainstream academia will NEVER allow any “turning on heads” as any inconvenient truth about their vaunted so-called facts will be summarily dealt with and squelched. Squelching truth is a lot easier than heads turning on themselves. It’s as old as when they did it to Galileo in the 17th century when he tried to turn science on its head. How’d that work out for him?
You are an idiot. I know of no real scientists or engineers who would try to hide this result if it is true. I think you have been spending too much time watching the rather fake attempts by the anti-science establishments to discredit real work and then applying what you see to your own perceptions. If this is true and can be verified then the scientific principles that build modern life will need to be reviewed. That is supremely exciting and would in no way invalidate anything that we have already learned. Learn a little bit before trying to lay some grand conspiracy theory at the feet of those who work in the field.
There’s little point in using reason against believers in such dating and timelines. They are not scientists but rather preachers of a religion pretending to be science. They willfully ignore two of the core principles of science, which I believe are officially defined as “Pics or it didn’t happen” and “Bet you can’t do that again.”
Cue vicious but futile attempts at retort. Haters gonna hate.
I always hate it when I have to respond to you. You’re an idiot, and worse yet you’re an asshole.
Unfortunately, there is little point in arguing about science with people who’ve decided that A) they know more than people who have spent their lives studying the topic and B) that they are horribly oppressed by these scholars.
Maybe, if you work hard and study, you’ll be able to figure out why you have so many haters. But given your inability to use basic logic I doubt it.
At the very least, we are trying to come up with answers to relevant questions, while lazy dogmatic scum do nothing, point to James Ussher’s dubious calculations, or ignore the importance of the question completely.
Knowing the real history of this planet, not the mythological bullshit, is quite important. Understanding chronology of human history and geology are some of the major instruments we have to ensure we have at least a chance of survival in the future.
You may want to remain painfully and wilfully ignorant on the details of the complex web of data and evidence that show correlation between a myriad of methods, rendering their supposed error into deep insignificance, but that is irrelevant for you. No matter how wrong you think we are, you’d better be hoping that we are in fact very much right, because I don’t see the answers for the future coming from slobs like you, but you need those answers to be accurate just as bad as we do.
~ An archaeologist at a U with an isotopic research centre. (if anybody wishes to make the fallacy of argument from authority)
I’m actually on a roll today. So I’m going to respond to you again. This time about Galileo. I’m curious if you have studied what actually happened in his case. I would be happy to hear how an idiot’s view (and I am using idiot knowing the definition) can describe how his situation occurred. I may have a limited view of the renaissance and scientific reason (but I doubt it since I actually have studied it extensively).
And while you are describing your view of Galileo, please explain to us why you are trolling a science based site? I realize I am just feeding you, but like I said above I’m on a roll today.
So once you are finished with your little mental masturbation exercise, kindly fuck off.
Carbon-14 is only used for a maximum range of about 60,000 years. Other isotopes are used for longer times. Uranium-thorium dating, for example, is good for around 500,000 years before it becomes too uncertain.
Yep! [sonofthunderboanerges]: Not a scientist… Let me tell you as someone who makes a living as a scientist: Our field is SO COMPETITIVE that all these “conspiracies” would never be able to exist in the real world. We are all trying to tap into the limited resources available to the scientific community to do our research and make a living. If I came across something that would shake the foundations of science, I would publish, go to meetings and tell everyone about it and apply for grants at every institution and foundation on the planet!!! And you know what? My grant applications would be approved because it is easier to be funded for something really exciting than for the same banality everyone else applies for… I would be rich and famous! I would be standing in line for a Nobel Prize!!!
The catch is: you have to prove your “science foundation shaking” hypothesis still stands after it is tested with real science (double blind tests, sufficient sampling and rigorous statistical testing, mathematical modeling and prediction power, etc.) and peer review by the rest of the scientific community. Trust me, scientists are not some secretive Masonic lodge (nothing against Masons, it is just almost a synonym for “secretive”) conspiring to keep humanity believing some absurd things for some even more absurd reason… No, they are even worse! They are funding-hungry vultures that will do anything to get their claws on some money to do their research! If this includes ripping apart established ideas, they’ll do it happily… What they will not do, is gather in secret at a University’s basement (sorry, no dungeons at most institutions) to conspire on how to fool the world. Sorry, too busy and too competitive for this…
About time some of Jan’s stuff made it to HAD. He’s an active contributor at the sci.electronics.* groups (you know… that obsolete and wonderful thing called Usenet). Whether you share his views or not, his posts are always interesting.
Don’t know Usenet? Get a newsreader and jump in!
I don’t know if people should get their first introduction to usenet through the sci.electronics.design newsgroup. That group is full of old farts who spend their day bickering about politics.
That is absolutely true unfortunately, but most of them are also very competent people. If you filter out the nonsense the remaining content is still orders of magnitude better than a lot of technical blogs out there.
We may need some information about the accuracy of this instrument.
Runner up, Jim Williams dead-bug wiring prize.
No control = fail.
Wrong thread = fail too.
I’m going to go wind up my alarm clock to wake me up on time in the morning, and set it under the bedside lamp for a few minutes so that the hands will glow in the dark.
I say good for him. Quite an effort, and he will learn a lot. One thing I’d note: with a photomultiplier (apparently rejected as a detector here) you can get the energy of the particle from the light produced in a crystal or scintillation liquid as used by biologists (they call them fluors), etc. In fact, mix the tritium with the fluor in a silvered bottle with a half-inch photomultiplier at one end. Perfect! If you can get the energy, you can reject all the cosmic and background radiation.
Also, maybe he will give a theory that is being tested. I mean a real theory, with a reason and some math or some stuff about the thing at the place. You know. Honestly, I thought this was a late April Fool item, but so much work! And I always love to see an op-amp used well.
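The energy-discrimination idea above is simple to sketch. In this hypothetical example the pulse energies are made-up numbers; the only real figure is the 18.6 keV endpoint of the tritium beta spectrum:

```python
# With a scintillator + photomultiplier, each pulse's height gives the
# particle energy, so out-of-band events can be rejected.
TRITIUM_BETA_MAX_KEV = 18.6  # endpoint energy of the tritium beta spectrum


def accept(pulses_kev):
    """Keep only pulses consistent with tritium betas."""
    return [e for e in pulses_kev if 0 < e <= TRITIUM_BETA_MAX_KEV]


# Hypothetical pulse energies in keV: a few tritium-like events plus two
# high-energy cosmic/background hits.
pulses = [5.7, 2.1, 662.0, 4.9, 18.6, 1200.0]
print(accept(pulses))  # → [5.7, 2.1, 4.9, 18.6]
```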
One gotcha with certain sensors, they are sensitive to moisture.
CdS are particularly bad for this and in fact you can certainly detect a trend of the resistance changing with time IF they are kept in a damp environment.
The author should be nominated for the HaD award IMHO; just doing the experiment, whether it worked or not, at least generated useful data.
Further analysis of multiple geographically separated sensors using different materials would be fascinating and may earn someone a Ph.D. someday.
lol the soldering :D best I’ve seen in a long time.
Without understanding exactly how the data was generated, I dumped the data that was presented into excel to see what could be seen.
The first year’s data showed an UPWARD trend for about the first 500 samples – you’ll have to explain why that happened, and how that effect was compensated. Mind you, this is not some small, hard-to-see bump.
The remainder of the data was effectively a straight line. The difference between a simple linear line (form of mX+b) and the trend line was on the order of 10^-8, or, 10 PPB (Parts Per Billion). An exponential decay over 1/12 half-life would show a far greater deviation from a straight line than this.
I don’t know what you’re measuring, but I’m fairly certain it’s not radioactive decay.
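As a back-of-the-envelope check of that claim: assuming the textbook tritium half-life of about 12.3 years, one year of a true exponential should sag below a straight chord by roughly x²/8, where x is the fractional decay over the span:

```python
import math

# How far should a true exponential sag below a straight line over one
# year of tritium decay? Half-life is the textbook ~12.3 years.
T_HALF_YEARS = 12.3
SPAN_YEARS = 1.0

x = math.log(2) / T_HALF_YEARS * SPAN_YEARS   # fractional decay, ~0.056

# Midpoint gap between the chord through the endpoints and the curve
# itself; for small x this is approximately x**2 / 8.
chord_mid = (1 + math.exp(-x)) / 2
curve_mid = math.exp(-x / 2)
gap = chord_mid - curve_mid

print(f"expected deviation from linearity: {gap:.1e}")  # ~4e-4 (~400 ppm)
```

That is tens of thousands of times larger than the 10 ppb deviation seen in the data, which supports the point above.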
A really good experiment would have duplicated the detector, sharing as much as possible between the two, except that one would not have a Tritium light source.
Purdue published an article on this years ago and has been studying this ever since. I see no baseline reference for the data, showing that any variance in measurement is not the result of some other variable…
So this looks like a colossal waste of time?
Why would you not use two detectors, one without a radioactive sample and one with????
Was there any point to this 2-year project other than to show how *not* to conduct a scientific experiment?
Google “long term experiment”.
Also it is worth mentioning that the decay in brightness is possibly phosphor decay, as these tritium lamps use an aqueous source (IIRC THO, aka doubly heavy water) and the phosphor thus decays far faster than you would expect.
The increase *might* be due to water absorption leading to a short-term increase close to manufacture; when this equilibrates, the usual decay + water damage begins.