Nuclear can feel very scary. Dangerous per unit of energy produced? Not so much.
Feel scary? Bluice, the problems with both radioactive waste and nuclear accidents are very, very real.
So? Nothing is completely risk free in this world. Nuclear power is the safest form of energy per unit produced. I suppose we want to look at scientific evidence on this forum.
Bluice,
Chernobyl alone killed 10,000 emergency workers, not the five dozen the nuclear industry would like to believe; even Russia grudgingly acknowledged 4,000 deaths officially. But the deaths go far beyond the emergency workers: estimates range into the hundreds of thousands of premature deaths as a direct result of the accident. The families of Pripyat suffer many chronic diseases as a direct result of their exposure. The former head of the Ukrainian parliament's "permanent" commission for the study of the Chernobyl disaster, who was an engineer on Chernobyl Unit 4, reported that his wife and children each have on average eight severe diseases as a result of their exposures.
The nuclear industry desperately wants to believe in hormesis - the idea that radiation at doses under 10 rem per year is beneficial rather than detrimental. This belief is founded on several misunderstandings of basic science. First, it is based on examining plots of excess cancer deaths versus excess radiation exposure in the Japanese population affected by the nuclear weapons the United States detonated there. They argue that, because the error bar for the lowest dose band extends below the ordinate, the null hypothesis cannot be rejected - i.e., that radiation might reduce the cancer death rate.
They ignore several factors in making this argument.
First, it is not currently possible to identify the causal origin of individual cancers and cancer deaths. As a result, "excess deaths" cannot be separated from total deaths. The proper ordinate is zero cancer incidence, or the lowest cancer incidence rate observed for the whole population. Second, they exclude background radiation from the doses assigned to the population sample; on average this amounts to 20-30 rem of lifetime exposure. Third, they omit error bars on the radiation dose assignments, both those arising as an artifact of data binning and those inherent in estimating each individual's assigned dose. None of the exposed population carried dosimeters, and there has been no detailed assessment of ingested dose.
In either of the two cases (lowest observed cancer rates or zero cancer rates), the error bars do NOT go below the ordinate and the null hypothesis is rejected.
The second and third factors blur the median across a large radiation dose range. The net effect of all three is to show a trend line indistinguishable from linear from high dose in the 100 plus rem range all the way down to background doses. The null hypothesis is rejected. There is no hormetic effect.
Next, they argue that hormesis works by activating the immune system, which then lowers the impact of other causes of cancer as the body fights off what it perceives as a defect or assault. Even if true, this does not make the radiation damage harmless; it simply masks it. By the same logic, one could argue for increasing the population's exposure to benzene or other carcinogenic chemicals for their disease-lowering effect. That is of course absurd!
In point of fact the reason the nuclear experts argue for the hormetic effect and for eliminating the assessment of cumulative dose is that these things show risks to the whole population that argue strongly against the continued releases of nuclear materials and routine radiation exposures. That in turn argues against nuclear power. This is a case of putting a heavy thumb on the scales. It is wrong.
Next, the US DOE, the NRC, and likely others argue that the doses are below background and are therefore unimportant. This fallacy has its roots in environmental law arguments in the United States. In passing the various environmental laws, it was argued that industry should not be responsible for "background," a.k.a. "natural," exposures - i.e., that they should not have to extend their cleanups of toxic wastes to cleaning up the natural world. This later merged with the notion that natural is good, becoming the idea that background doses cannot be harmful and are in fact beneficial.
For nuclear matters, NRC and DOE count medical use of radiation and radioactive material exposures as "background". As that "background" has steadily risen with the expanded use of Computerized Tomography (CT and CAT scans), and nuclear medicine to fight cancer, the average "background" dose that they assign has now risen to between 650 millirem and 960 millirem per year. DOE and NRC disagree on the value to use. In both cases, this is far above the routine background of 100-200 millirem per year of natural background (sans radon - which is a special case, and which should not be included as a result).
Also, the nuclear industry still relies on the presumption that the energetic damage caused by radiation is the causative factor, and that therefore the doses from all forms of radiation can be aggregated into a single "whole body equivalent" dose parameter. EPA maintains a detailed assessment of the dose factors for each radionuclide and each pathway of exposure. These bear on precisely which effects (cancers and other diseases) have been shown to be caused by each. And that limitation of "has been shown to" reveals a serious flaw in the analysis. At low dose rates it is difficult to gather enough data to establish causality and correlations. The standard default is that when a cause cannot be definitively "shown," it is presumed to do no harm. This too is wrong.
Moreover, the crude radiation dose factors used to evaluate harm rest on several fallacies. First, again from the bomb studies and the beliefs of those involved, a Dose Reduction Equivalence Factor (DREF) is assigned. Looking at the slope of the cancer-death curve from the Japanese bomb survivor data, they argued early on that "we know" the immune system performs repair at low doses, and therefore arbitrarily cut the slope value in half by assigning a DREF of 2. There was NO valid justification for this. Slowly over time, the US EPA has been reducing the DREF on a case-by-case basis from 2 to 1.5 to 1. This was simply an earlier application of the "hormesis" idea to real-world data, in violation of any scientific basis or principle. As late as the mid-1990s, some argued at national conferences that the DREF should be raised to 50-100(!) rather than being eliminated. This argument was of course fallacious, driven by the desire to have nuclear be classed as causing no harm, and hence as beneficial.
Add to this that, since the worker base is predominantly well-paid, well-fed, well-cared-for older white males, a series of additional biases enters. These include the so-called "healthy worker effect": these workers are healthier to begin with than the average population, and much healthier than the most vulnerable parts of the population. This creates a strong bias toward lower observed effect rates, and in some cases reverses them into apparent health benefits where none exist.
Add too that the cancer slope values decline with age, and that men are less susceptible to radiation induced cancers and several huge biases rear their ugly heads. Women are on average 50% more vulnerable. Teenagers are 5-10 times more vulnerable. Infants are at least 20 times more vulnerable.
Yet the safety standard for everyone is based on a DREF of 2 and the aforesaid 40-60 year old healthy white American males. For the most vulnerable parts of society, this understates their actual risk by a factor of at least 40.
Recent studies also show that radiation-induced cardiovascular and stroke death risks are every bit the equal of cancer lethality from radiation exposure. This is another, separate understatement of the risk by a factor of 3 or more.
Combining all of these, the death risk rate for the most vulnerable parts of the population (excluding fetuses) is something like 5.4% per rem of exposure, not the optimistic 4.5 x 10^-4 latent cancer fatalities per rem of exposure that is used as the standard. And this does not include immune system, gastrointestinal, or other impacts.
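The arithmetic behind those two figures - the factor-of-40 understatement and the 5.4% per rem - can be sketched directly from the numbers quoted above (the standard 4.5 x 10^-4 latent cancer fatalities per rem, the DREF of 2, the 20x infant vulnerability, and the factor of 3 for cardiovascular and stroke deaths). The variable names here are illustrative only, not from any regulatory model:

```python
# Rough arithmetic behind the figures quoted above. This is a back-of-the-
# envelope sketch using only the numbers cited in this post, not a dosimetry
# model.

standard_lcf_per_rem = 4.5e-4   # standard latent cancer fatality risk per rem

dref_correction = 2             # undoing the arbitrary DREF of 2
infant_vulnerability = 20       # infants at least 20x more vulnerable
understatement = dref_correction * infant_vulnerability  # factor of 40

cardio_stroke_factor = 3        # cardiovascular + stroke deaths rival cancer deaths

vulnerable_risk_per_rem = (standard_lcf_per_rem
                           * understatement
                           * cardio_stroke_factor)

print(understatement)                        # 40
print(round(vulnerable_risk_per_rem, 4))     # 0.054, i.e. 5.4% per rem
```

So the 5.4% per rem figure for the most vulnerable follows directly from multiplying the standard coefficient by the claimed correction factors.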
It goes on like this. There are other major flaws in the measurement and accounting for exposure to radiation that seriously impact the assessments. As a result arguing about the safety of nuclear is done against a severe effort by the industry to warp the scales in what they perceive as their favor.
This is not at all surprising. As was previously pointed out, the industry was born out of the second world war and the desire to build atomic weapons to beat the "other" guy, and to build warships powered by nuclear. Anything that might influence the population to agitate against those goals has to be stopped.
Bluice - as to your argument about CO2 emissions, you are, I suspect, counting only the CO2 emissions of operating reactors. It is common for the industry to entirely neglect the CO2 emitted in designing, building, and dismantling the plants, the CO2 released in the hugely energy-intensive effort to enrich the uranium fuel, in processing spent fuel, and in disposing of both the plants and the fuel. These are never accounted for in estimates of the impact of nuclear versus other energy sources.
And on and on and on...
Arguments for next generation recycling reactors have even more severe flaws and omissions in impacts, difficulties, and energetic inefficiencies, along with dramatically increased risks. Also not included is the ever-increasing pool of fissile materials that poses an existential risk to all of mankind. There is simply no way to get rid of the stuff. Worse yet, as other nuclides decay away, it only becomes better and better suited for use in nuclear weapons.
Arguments for other reactors, whether thorium fueled, molten salt, etc., have their own, even worse problems. The thorium reactors have many serious issues. Just one of those is the ongoing production of U-233, an immensely weapons-usable and weapons-attractive isotope that can be slip-streamed out of the operating reactor. Another is the use of beryllium-rich salts in molten salt reactors, which make them so exquisitely toxic that no one should ever consider them. For the mixed oxide fueled reactors, there are immense problems in recycling the fuel to usable form, and energetic and financial inefficiencies that render such reactors impossible to make competitive in any market-based system. Recycling weapons plutonium into reactor fuel similarly has severe issues: the gallium alloying element used in the weapons is extremely hard to extract from the plutonium to the levels required to make the fuel safe in reactors. The MOX fueled, and especially plutonium fueled, reactors have very narrow safety margins - at best. Etc.
All of this doesn't even begin to account for the operational, engineering, and financial obstacles that must be overcome for the widespread use of nuclear power. Whenever one of these stations goes offline, it remains offline for over a month, during which time alternate power is required from other sources. For reasons of economic competitiveness, given all the costs of operating these beasts, they must always run at 100% power. They cannot be used as peaking plants. And since society exhibits a diurnal power-use variation, they strip off all of the base load capacity available to other power sources; all of the others must then be able to vary their production from 0-100% daily.
That should be more than enough...