That is right (if rather exaggerated; and I will note that it was you who originally picked the figure of two percent). In practice, we accept a certain risk that we will not always have all the capacity we want, even though (or because) we cannot precisely predict how big or how frequent these events will be. There is no particular reason to think this specific case is any different.
Why can't we predict how big or how often those events would be? We have a clear understanding of the probability distributions for all kinds of weather scenarios - see, for example, 1-in-50/100/1000-year floods and droughts.
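For what it's worth, the arithmetic behind those return periods is straightforward. A quick sketch (the return periods and horizon are illustrative, not tied to any real dataset):

```python
# Probability of seeing at least one N-year event over a planning horizon.
# An "N-year" flood/drought has annual exceedance probability 1/N.

def p_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """Chance of >= 1 exceedance in `horizon_years`, assuming independent years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for rp in (50, 100, 1000):
    p = p_at_least_one(rp, 30)
    print(f"1-in-{rp}-year event, 30-year horizon: {p:.1%}")
```

Over a 30-year horizon this works out to roughly 45%, 26%, and 3% respectively - rare events are not that rare once you integrate over the life of the infrastructure.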
I'm not saying we cannot do it, just that we cannot always get it right, and there is plenty of empirical evidence for that.
The second point is that the distribution has a long tail, especially when we consider the possibility of multiple independent incidents overlapping in time. At some point it becomes infeasible to suppose that we could be prepared to continue operating as if nothing had happened in every conceivable scenario, no matter how accurately we can predict their likelihood.
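To illustrate that tail: with a handful of independent units, the probability of many simultaneous outages falls off quickly but never reaches zero. A toy binomial sketch (the fleet size and outage rate are made-up numbers, not from any real fleet):

```python
from math import comb

# Long-tail illustration: probability that k or more of n independent
# generating units are out at the same time, each with forced-outage rate p.

def p_k_or_more_out(n: int, p: float, k: int) -> float:
    """P(at least k of n units simultaneously unavailable), binomial model."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

n, p = 10, 0.05  # 10 units, 5% forced-outage rate each (illustrative)
for k in range(1, 6):
    print(f"P(>= {k} units out simultaneously) = {p_k_or_more_out(n, p, k):.2e}")
```

However far down the tail you plan for, the next scenario out still has nonzero probability; the only question is where you stop paying to cover it.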
I do not understand your argument. We also cannot always get it right when predicting the failures of fossil fuel generation. Sometimes multiple plants have outages that coincide and we get blackouts. Shit happens, and will continue to happen. Meanwhile, we can still make statistically rational plans.
We have coal-fired plants in Australia with less than 90% uptime (much of it unscheduled downtime), but somehow they're considered baseload rather than intermittent.
And I cannot figure out why you are saying this, as nothing I have said previously either contradicts what you say here or is contradicted by it. If you could say what you think I am saying in my posts in this thread, we can sort it out.
EDIT: I see the problem starts with the first sentence of your first post here: “Why can't we predict how big or how often those events would be?” - which is completely beside the point in my response to rgmerk, who wrote “It's not clear (yet) what a 100% clean energy powered world would use to cover the last couple of percent of demand when loads peak and/or variable generation troughs for extended periods.” My response to that and the follow-up is: a) if we are talking about two percent, we can overbuild the renewable capacity, and b) if we are considering all eventualities, there inevitably comes a point where we say that we are not going to prepare for uninterrupted service in a given event.
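A toy sketch of both halves of that argument, with entirely made-up hourly capacity factors (the distribution and sizing below are mine, not from any study): overbuilding shrinks the unserved fraction quickly, which is point (a), but a stubborn residual tail remains, which is point (b).

```python
import random

# Toy model: hourly output is a random fraction of installed capacity,
# demand is flat at 1.0. All numbers are illustrative only.

random.seed(42)
HOURS = 8760
gen_cf = [max(0.0, random.gauss(0.35, 0.15)) for _ in range(HOURS)]

def unserved_fraction(overbuild: float) -> float:
    """Fraction of annual demand unmet with `overbuild` x the base capacity."""
    base_capacity = 1.0 / 0.35  # sized so average output roughly matches demand
    shortfall = sum(max(0.0, 1.0 - cf * base_capacity * overbuild)
                    for cf in gen_cf)
    return shortfall / HOURS

for ob in (1.0, 1.5, 2.0, 3.0):
    print(f"overbuild {ob:.1f}x -> unserved {unserved_fraction(ob):.2%}")
```

Each extra increment of overbuild buys less and less reliability, which is exactly why at some point you stop overbuilding and accept (or cover otherwise) the residual.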
No you didn't; you pointed out why it is not, in itself, a significant issue in the first place (which rgmerk tacitly seems to recognize in his first response, by pivoting away from the 2% claim). My position on this has been that if the issue really is over ~2%, there is a simple solution.
I'll state it plainly: to get to the same level of reliability as the existing grid with just wind, solar, and batteries requires unacceptable amounts of overprovisioning of these at high latitude (or unacceptably high transmission cost).
Fortunately, use of different long duration storage (not batteries) can solve the problem more economically.
We can and do, and there are detailed plans based on those weather scenarios (eg for the Australian east coast grid there is AEMO’s Integrated System Plan).
Things in the US are a bit more of a mixed bag, for better or worse, but there have been studies suggesting that you can get to very high renewables levels cost-effectively, though not to 100% without new technology (eg “clean firm” power like geothermal, new nuclear being something other than a clusterfumble, long-term storage like iron-air batteries, etc etc etc).
The best technologies there are (IMO) e-fuels and extremely low capex thermal.
There are interesting engineering problems for sources that are intended to operate very infrequently and at very low capacity factor, as might be needed for covering Dunkelflauten. E-fuels burned with liquid oxygen (and water to reduce temperature) in rocket-like combustors might be better than conventional gas turbines for that.
It's mostly something I thought about myself. The prompting idea was how to massively reduce the capex of a turbine system, even if that increases the marginal cost per kWh when the system is in use, plus the observation of the incredibly high power density of rockets (they're the highest-power-density heat engines humanity makes). So: get rid of the compressor stage of the turbine, run open cycle so there's no need to condense steam back to water, and operate at higher pressure (at least an order of magnitude higher than combustion turbines) so the entire thing can be smaller.
You'd have to pay for storage of water and LOX (and for making the LOX), so this wouldn't make sense for prolonged usage. On the plus side, using pure LOX means no NOx formation, so you also lose the catalytic NOx destruction system a stationary gas turbine would need to treat its exhaust.
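A back-of-envelope way to see why the capex-versus-fuel tradeoff flips at very low capacity factor (every cost figure below is a hypothetical placeholder I picked for illustration, not a real quote for either technology):

```python
# Levelized cost for a peaker that runs rarely: annualized capex spread
# over few MWh dominates, so a cheap-to-build unit with expensive e-fuel
# can come out ahead. All figures are hypothetical.

def lcoe_usd_per_mwh(capex_usd_per_kw: float, fuel_usd_per_mwh: float,
                     capacity_factor: float, crf: float = 0.1) -> float:
    """Levelized cost: annualized capex per MWh produced, plus fuel cost."""
    annual_capex_per_kw = capex_usd_per_kw * crf  # $/kW-year (crf = capital recovery factor)
    mwh_per_kw_year = 8.76 * capacity_factor      # MWh per kW per year
    return annual_capex_per_kw / mwh_per_kw_year + fuel_usd_per_mwh

for cf in (0.05, 0.01):
    gas = lcoe_usd_per_mwh(800, 60, cf)      # conventional turbine, cheaper fuel
    rocket = lcoe_usd_per_mwh(200, 150, cf)  # low-capex combustor, pricier e-fuel
    print(f"CF {cf:.0%}: conventional ${gas:.0f}/MWh vs low-capex ${rocket:.0f}/MWh")
```

At a 1% capacity factor the hypothetical low-capex unit is well ahead despite the pricier e-fuel; at ordinary capacity factors the conventional turbine wins, which is why this only makes sense for rarely-used reserve capacity.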
I vaguely recall some people in Germany were looking at something like this but I don't remember any details.