Has anyone done a study of how much radioactive waste is released by a coal-fired plant? I noticed once that a uranium mine was only a couple hundred miles from a coal mine. The concentration must be very low, but we burn millions of tons of that stuff.
Jim, I am not one hundred percent sure that the following is correct, but 90 percent sure is good enough for me. Burning the coal would not make anything radioactive: radioactivity is a property of unstable atomic nuclei, and an ordinary chemical process like combustion only rearranges electrons, leaving the nuclei untouched. But by the same token, any trace uranium or similar material already in the coal is radioactive before, during, and after the burn, so whatever activity was in the coal ends up in the ash or in the flue gas rather than being created or destroyed by the plant.
Recently, interest in this topic seems to be on the increase. I recall a financial newsletter about a company claiming to have invented a process to extract uranium from coal, but I haven't been able to locate that article. However, I've found these articles:
I was thinking of the emitters from the decomposed granite in the coal (alpha and gamma sources, not beta). I assume the concentrations would be negligible, but when millions of tons of the stuff get burned, it has to add up.
According to the article: in China, more than 2.7 billion tons of ash sit unused ... coal needs a uranium content of at least 50 parts per million to be comparable to a low-grade uranium deposit ... three power plants located nearby each other are burning coal with a uranium content averaging 65 parts per million, and the lime content (induced in the burner for emissions control) in the leftover ash is low. Recovering 70% of the uranium in the ash at a cost of $20 to $35 a pound is economically competitive at a spot uranium price of about $42 a pound. The company expects to produce up to two million pounds of uranium in China annually after three years - enough to fuel a new reactor.
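Out of curiosity I ran the article's numbers. The sketch below uses the quoted 65 ppm uranium content, 70% recovery, and the $20-$35/lb cost versus a $42/lb spot price; the annual coal tonnage is my own made-up illustration, since the article doesn't give one:

```python
# Back-of-the-envelope check of the article's figures.
# URANIUM_PPM, RECOVERY_FRACTION, COST_PER_LB, SPOT_PER_LB come from
# the article; the coal tonnage is a hypothetical illustration.
URANIUM_PPM = 65          # uranium content of the coal, parts per million
RECOVERY_FRACTION = 0.70  # fraction of the uranium recoverable from the ash
COST_PER_LB = 35.0        # high end of the quoted $20-$35/lb recovery cost
SPOT_PER_LB = 42.0        # quoted spot uranium price, $/lb

LB_PER_KG = 2.20462

def recoverable_uranium_lb(coal_tonnes):
    """Pounds of uranium recoverable from burning `coal_tonnes` of coal."""
    uranium_kg = coal_tonnes * 1000 * URANIUM_PPM / 1e6  # ppm by mass
    return uranium_kg * RECOVERY_FRACTION * LB_PER_KG

coal_tonnes = 10_000_000  # hypothetical combined annual burn, three plants
lbs = recoverable_uranium_lb(coal_tonnes)
margin = (SPOT_PER_LB - COST_PER_LB) * lbs
print(f"recoverable uranium: {lbs:,.0f} lb")
print(f"margin at spot:      ${margin:,.0f}")
```

At 10 million tonnes a year this comes out around a million pounds of uranium, which is at least the same order of magnitude as the company's stated two-million-pound target, so the article's numbers hang together even at the worst-case recovery cost.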
Natural radioactive elements like uranium, radium, and thorium can be present in a wide range of minerals that appear as crystals in granite from around the world. Moreover, according to the EPA, granite will have some amount of radioactivity (emissions of alpha or beta particles or gamma rays), depending on the composition of the molten rock from which it formed. Also, certain radioactive elements in granite decay into radon, a colorless, odorless, radioactive gas which may be released from the granite over time.
It's probably not economical, then, to attempt to recover uranium from granite. Coal is different: it gets burned anyway, and the ash left over has much less mass than the original coal, which concentrates the uranium. Nobody would burn granite just for uranium recovery.
Uranium salvaging is not really what I was trying to find out. The perceived drawback to nuclear power is the disposal of the radioactive waste, along with the risk of an airborne or waterborne release of radioactivity in case of an accident. I want to know if anyone has done an assessment of the amount of radioactive waste generated by a coal-fired plant, both in the ash storage and in the airborne particles that escape the scrubbers. That way the risks can be compared directly.
Even with a major effort on renewables, a coal/nuclear mix will still account for 60% of our power generation. Which has the lower environmental impact due to radioactive waste?
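For a very rough starting point, you can estimate the activity passing through a plant from first principles. This is my own sketch, not a published assessment: it counts only the uranium-238 itself (no decay daughters, no thorium, no radium), so it understates the total, and both the 2 ppm uranium content and the 4 million tonnes/yr burn rate are assumed illustrative values:

```python
# Rough first-principles estimate of the U-238 activity in a year's
# worth of coal. Only physical constants are real; the coal tonnage
# and ppm figure are illustrative assumptions.
import math

AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7
U238_HALF_LIFE_S = 4.468e9 * SECONDS_PER_YEAR  # 4.468 billion years
U238_MOLAR_MASS_G = 238.0

def u238_activity_bq_per_gram():
    """Specific activity of U-238: A = ln(2)/t_half * atoms per gram."""
    decay_const = math.log(2) / U238_HALF_LIFE_S
    atoms_per_gram = AVOGADRO / U238_MOLAR_MASS_G
    return decay_const * atoms_per_gram  # about 12,400 Bq/g

def coal_u238_activity_bq(coal_tonnes, uranium_ppm):
    """Total U-238 activity in `coal_tonnes` of coal at `uranium_ppm`."""
    uranium_g = coal_tonnes * 1e6 * uranium_ppm / 1e6  # tonnes -> g, then ppm
    return uranium_g * u238_activity_bq_per_gram()

# Hypothetical plant burning 4 million tonnes/yr of 2 ppm coal:
activity = coal_u238_activity_bq(4_000_000, 2)
print(f"U-238 specific activity: {u238_activity_bq_per_gram():,.0f} Bq/g")
print(f"annual U-238 throughput: {activity:.2e} Bq")
```

Whatever fraction of that ends up in the ash ponds versus out the stack is exactly the split the scrubber assessment I'm asking about would have to measure; the point of the sketch is just that the total is not zero, so the coal-versus-nuclear comparison is a real question.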