
The parent's point is sound: estimates only have validity when they concern well-understood or frequently undertaken activities. There is no basis for certainty about decommissioning nuclear plants; older plants especially have unique problems. Every plant that has had an incident has seen enormous cost overruns, and many plants will have problems discovered only upon disassembly.


Problems unique to older plants have no relevance to the estimates for newer plants. They're not even relevant to the choice of whether to keep existing plants in operation, except insofar as pushing the decommissioning date further into the future lowers its net present cost, due to the time value of money.
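To put a number on the time-value effect, here's a minimal sketch in Python; the $1B cost and 5% discount rate are illustrative assumptions, not figures for any actual plant:

    # Net present cost of a fixed nominal bill, discounted at a constant rate.
    cost = 1_000_000_000  # assumed nominal decommissioning cost, illustrative
    rate = 0.05           # assumed real discount rate per year

    def present_cost(years_until_decommissioning):
        # present cost = nominal cost / (1 + r)^t
        return cost / (1 + rate) ** years_until_decommissioning

    print(f"decommission in 10 years: ${present_cost(10):,.0f}")  # ~$614M
    print(f"decommission in 30 years: ${present_cost(30):,.0f}")  # ~$231M

Pushing the same bill 20 years further out cuts its present cost by more than half under these assumptions.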


Ah but they do! As we discover those unique problems, how to solve them, how long they take, and what they cost, our a priori cost estimates improve. The total numbers are so small that it is really hard to say what the reliability is. Newer plants tend to have more commonality and fewer parts, but we have yet to discover all the many fantastic failure mechanisms. There are only 7 active Gen III reactors in the world, so it will be some time before we gain statistical confidence -- well after future designs are finalised.
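For a rough sense of how little n = 7 buys you statistically, here's a minimal sketch using the rule of three; treating each reactor as an independent trial with a common failure probability is an illustrative simplification:

    # Rule of three: with n independent trials and zero observed failures,
    # the 95% upper confidence bound on the per-trial failure probability
    # is approximately 3/n.
    n_reactors = 7
    upper_bound_95 = 3 / n_reactors
    print(f"95% upper bound on failure probability: {upper_bound_95:.0%}")  # ~43%

Even a clean record across all seven reactors is statistically consistent with a per-reactor failure probability of roughly 43%.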


> Ah but they do! As we discover those unique problems, how to solve them, how long they take, and what they cost, our a priori cost estimates improve.

But it's not just our cost estimate that improves. Gen III reactors were designed with more data on how Gen I reactors were decommissioned than was available when Gen II reactors were being designed, and so on.

Finding a new decommissioning snag affects how you design the next reactor you build, but it doesn't really affect that reactor's predicted decommissioning cost: by that point you're aware of the issue and take steps to prevent it from occurring in the generation now being constructed.



