Sun may affect radioactive decay rates
Aug. 25, 2010
During a search for a radioactive isotope-based random number generator, researchers discovered that radioactive decay rates, previously thought to be constant, appear to be influenced by the activity of the Sun.
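The random-number idea works because the timing between individual decays is genuinely unpredictable. As a rough sketch of how such a generator can extract bits (simulated here with Python's `random.expovariate` standing in for a real detector; the rate and counts are illustrative assumptions), one common approach compares successive inter-decay intervals:

```python
import random  # stand-in for a hardware decay counter; illustration only

def decay_intervals(rate_hz, n):
    # Inter-arrival times of radioactive decays follow an exponential
    # distribution; a real device would instead timestamp counts from
    # a Geiger tube or similar detector.
    return [random.expovariate(rate_hz) for _ in range(n)]

def bits_from_intervals(intervals):
    # Debiasing trick: compare successive pairs of intervals.
    # t1 < t2 -> 0, t1 > t2 -> 1; (rare) equal pairs are discarded,
    # so a slow drift in the underlying rate does not bias the bits.
    bits = []
    for t1, t2 in zip(intervals[::2], intervals[1::2]):
        if t1 < t2:
            bits.append(0)
        elif t1 > t2:
            bits.append(1)
    return bits

bits = bits_from_intervals(decay_intervals(100.0, 2000))
```

Note that this pairwise comparison assumes the decay rate is at least locally constant, which is exactly the assumption the researchers found reason to question.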
On Dec. 13, 2006, the sun itself provided a crucial clue, when a solar flare sent a stream of particles and radiation toward Earth. Purdue nuclear engineer Jere Jenkins, while measuring the decay rate of manganese-54, a short-lived isotope used in medical diagnostics, noticed that the rate dipped slightly during the flare, and that the decrease had begun about a day and a half before the flare erupted.
If this apparent relationship between flares and decay rates proves real, it could lead to a method of predicting solar flares before they occur, which could help prevent damage to satellites and electric grids, as well as save the lives of astronauts in space.
The decay-rate aberrations that Jenkins noticed occurred during the middle of the night in Indiana -- meaning that something produced by the sun had traveled all the way through the Earth to reach Jenkins' detectors. What could the flare send forth that could have such an effect?
Jenkins and Purdue physicist Ephraim Fischbach guessed that the culprits in this bit of decay-rate mischief were probably solar neutrinos, the nearly massless particles famous for flying at almost the speed of light through the physical world -- humans, rocks, oceans, even whole planets -- with virtually no interaction with anything.
Maybe the science part of 2012 wasn't so far-fetched after all. (No, not really.)