Perhaps not in a 100% renewable world, though I'll give you the point that they are useful now.
In a 100% renewable world we would not be extracting or refining oil. Natural gas (used by these turbines) is a byproduct of oil drilling. Were we not burning the oil, the natural gas might be too expensive on its own.
Also, in a 100% renewable world we would (by definition) have enough generation all the time (covered by batteries and good baseload sources) that turbine power would no longer be required to cover peak loads.
It's not clear (yet) what a 100% clean energy powered world would use to cover the last couple of percent of demand when loads peak and/or variable generation troughs for extended periods.
It'll be some combination of demand management (which isn't nearly as horrifying as people make it out to be), pumped hydro, and long-duration batteries like iron-air, but possibly also burning hydrogen or hydrogen-derived synthetic fuels (produced by electrolysis when electricity is abundant) and/or biofuels in turbines.
Somebody calculated that a home in the UK needs a 1 MWh battery to back up solar energy through the winter. I suspect in 10 years that may cost below 25K, a small fraction of the property cost.
But is it really 1 MWh of _electricity_, or could you replace a good chunk of that with a huge tank of boiling water? In the winter, about half of my electricity consumption goes to my heat pump, to produce 45-50C water for heating and tap water. But if we could increase the reservoir temperature to 95C (or even go superheated to 160C at 6 bar), then it could supply the 45-50C flow temperature much longer without needing to recharge.
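For a rough sense of scale, here's a back-of-envelope sketch; the 500 kWh split and the COP of ~3 are assumptions for illustration, not figures from the posts above.

```python
# Back-of-envelope: thermal energy stored in a hot-water tank cycled
# between 95 C and a 50 C minimum useful flow temperature.
C_P = 4186             # J/(kg*K), specific heat of water
RHO = 1000             # kg/m^3, density (close enough over this range)
T_HOT, T_MIN = 95, 50  # C

kwh_per_m3 = RHO * C_P * (T_HOT - T_MIN) / 3.6e6
print(f"{kwh_per_m3:.0f} kWh of heat per m^3")   # ~52 kWh/m^3

# Hypothetical sizing: if ~500 kWh of the 1 MWh is heat-pump electricity
# and the heat pump's COP is ~3, that's ~1500 kWh of delivered heat.
heat_kwh = 500 * 3
print(f"{heat_kwh / kwh_per_m3:.0f} m^3 to hold {heat_kwh} kWh of heat")  # ~29 m^3
```

So a tank can shift a real chunk of the load, but because the heat pump multiplies each kWh of electricity by its COP, each kWh of stored heat only displaces a fraction of a kWh of battery capacity.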
That's probably assuming a solar system sized to cover typical summer energy usage. You can simply over-provision solar until you have wasted capacity in summer and little to no storage requirement in winter. Then it's just a tradeoff between battery and solar costs to find the best price point.
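A toy model of that tradeoff might look like the sketch below; every number in it (prices, yields, demand) is invented purely for illustration.

```python
# Toy solar-overbuild vs battery tradeoff. All figures are invented;
# real values depend heavily on market and latitude.
SOLAR_COST_PER_KW = 1000       # assumed installed cost per kW of panels
BATTERY_COST_PER_KWH = 300     # assumed installed cost per kWh of battery
SUMMER_KWH_PER_KW_DAY = 5.0    # assumed summer yield
WINTER_YIELD_FRACTION = 0.25   # assumed winter yield relative to summer
DAILY_NEED_KWH = 30            # assumed household winter demand
BRIDGE_DAYS = 5                # consecutive poor-weather days to bridge

def total_cost(overbuild, base_kw=5.0):
    """Cost of a summer-sized array scaled by `overbuild`, plus a battery
    sized to bridge BRIDGE_DAYS of whatever winter deficit remains."""
    solar_kw = base_kw * overbuild
    winter_daily = solar_kw * SUMMER_KWH_PER_KW_DAY * WINTER_YIELD_FRACTION
    battery_kwh = max(0.0, DAILY_NEED_KWH - winter_daily) * BRIDGE_DAYS
    return solar_kw * SOLAR_COST_PER_KW + battery_kwh * BATTERY_COST_PER_KWH

for x in (1, 2, 3, 4, 5, 6):
    print(f"{x}x solar: {total_cost(x):>7,.0f}")
# the minimum lands around 5x here, but it shifts as the assumed prices change
```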
Also, this calculation probably assumes no baseload power imported from the grid, where sources such as wind and tidal power work year-round and help offset the need for batteries.
That is right (if rather exaggerated, and I will note that it was you who originally picked the figure of two percent), and in practice we accept a certain risk that we will not always have all the capacity we want, even though (or because) we cannot precisely predict how big or how often these events will be. There is no particular reason to think this specific case is any different.
Why can't we predict how big or how often those events would be? We have a clear understanding of the probability distributions for all kinds of weather scenarios - see for example 1-in-50/100/1000-year floods and droughts.
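For what it's worth, the arithmetic behind those return periods is simple, at least under the usual assumption of independent years:

```python
# Chance of seeing at least one "1-in-T-year" event over an N-year planning
# horizon, treating years as independent (the standard simplification).
def prob_at_least_one(return_period_years, horizon_years):
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for T in (50, 100, 1000):
    print(f"1-in-{T}-year event within 30 years: {prob_at_least_one(T, 30):.0%}")
# ~45%, ~26%, ~3%
```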
I'm not saying we cannot do it, just that we cannot always get it right, and there is plenty of empirical evidence for that.
The second point is that the distribution has a long tail, especially when we consider the possibility of multiple independent incidents overlapping in time - to the point where it becomes infeasible to be prepared to continue operating as if nothing had happened in every conceivable scenario, regardless of how accurately we can predict their likelihood.
I do not understand your argument. We also cannot always get it right when predicting failures of fossil fuel generation. Sometimes multiple plants have outages that coincide and we have blackouts. Shit happens, and will continue to happen. Meanwhile, we can still make statistically rational plans.
We have coal-fired plants in Australia with <90% uptime (much of the downtime unscheduled), but somehow they're considered baseload rather than intermittent.
And I cannot figure out why you are saying this, as nothing I have said previously either contradicts what you say here or is contradicted by it. If you could say what you think I am saying in my posts in this thread, we can sort it out.
EDIT: I see the problem starts with the first sentence of your first post here: “Why can't we predict how big or how often those events would be?” - which is completely beside the point of my response to rgmerk, who wrote “It's not clear (yet) what a 100% clean energy powered world would use to cover the last couple of percent of demand when loads peak and/or variable generation troughs for extended periods.” My response to this and the follow-up is: a) if we are talking about two percent, we can overbuild the renewable capacity, and b) if we are considering all eventualities, there inevitably comes a point where we say that we are not going to prepare for uninterrupted service in a given event.
No you didn't; you pointed out why it is not, in itself, a significant issue in the first place (which rgmerk tacitly seems to recognize in his first response, by pivoting away from the 2% claim). My position on this has been that if the issue really is just that last ~2%, there is a simple solution.
I'll state it plainly: getting to the same level of reliability as the existing grid with just wind, solar, and batteries requires an unacceptable amount of overprovisioning at high latitudes (or unacceptably high transmission costs).
Fortunately, other forms of long-duration storage (not batteries) can solve the problem more economically.
We can and do, and there are detailed plans based on those weather scenarios (eg AEMO’s Integrated System Plan for the Australian east coast grid).
Things in the US are a bit more of a mixed bag, for better or worse, but studies suggest you can get to very high levels of renewables cost-effectively, though not to 100% without new technology (eg “clean firm” power like geothermal, new nuclear being something other than a clusterfumble, long-term storage like iron-air batteries, etc etc etc).
The best technologies there are (IMO) e-fuels and extremely low-capex thermal generation.
There are interesting engineering problems for sources that are intended to operate very infrequently and at very low capacity factor, as might be needed for covering Dunkelflauten. E-fuels burned with liquid oxygen (and water to reduce the temperature) in rocket-like combustors might be better than conventional gas turbines for that.
It's mostly something I thought about myself. The prompting idea was how to massively reduce the capex of a turbine system, even if that increases the marginal cost per kWh when the system is in use, together with the observation of the incredibly high power density of rockets (they're the highest-power-density heat engines humanity makes). So: get rid of the compressor stage of the turbine, run open cycle so there's no need to condense steam back to water, and operate at higher pressure (at least an order of magnitude higher than combustion turbines) so the entire thing can be smaller.
You'd have to pay for storage of water and LOX (and for making the LOX), so this wouldn't make sense for prolonged usage. On the plus side, using pure LOX means no NOx formation, so you also lose the catalytic NOx destruction system a stationary gas turbine would need to treat its exhaust.
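To make the capex-vs-fuel tradeoff concrete, here's a crude levelized-cost sketch; the capex and fuel numbers are invented, only the shape of the comparison matters.

```python
# Crude $/kWh for a peaker that runs only a few hours per year: annualized
# capex spread over the kWh actually produced, plus marginal (fuel) cost.
# All input numbers are invented for illustration.
def lcoe(capex_per_kw, fuel_per_kwh, capacity_factor, fixed_charge_rate=0.08):
    annual_kwh_per_kw = 8760 * capacity_factor
    return capex_per_kw * fixed_charge_rate / annual_kwh_per_kw + fuel_per_kwh

cf = 0.01   # ~88 hours per year
print(f"gas turbine (high capex, cheap fuel):          ${lcoe(800, 0.05, cf):.2f}/kWh")
print(f"rocket-style combustor (low capex, dear fuel):  ${lcoe(200, 0.30, cf):.2f}/kWh")
# at this capacity factor the capex term dominates, so the cheap-to-build
# machine wins despite burning much more expensive fuel
```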
I vaguely recall some people in Germany were looking at something like this but I don't remember any details.
The problem is that the last two percent isn't evenly distributed in time; it occurs rarely, but in large chunks. On average it's 2%, but not at each point in time.
Also, if solar ends up much cheaper than wind there's going to be a need for seasonal energy storage, which could be considerably more than 2% at high latitudes. Batteries are unsuitable for this.
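To illustrate why "two percent on average" can still be expensive to cover, here's a sketch with an assumed 10 GW grid; the numbers are made up but the arithmetic is straightforward.

```python
# Why "2% of annual energy" is easy if it's spread out and hard if it isn't.
AVG_DEMAND_GW = 10                     # assumed grid size
annual_gwh = AVG_DEMAND_GW * 8760      # ~87,600 GWh/year
shortfall_gwh = 0.02 * annual_gwh      # ~1,752 GWh not covered by wind+solar

# Spread evenly over the year it is a trickle:
print(f"evenly spread: {shortfall_gwh / 8760:.2f} GW continuously")   # ~0.2 GW

# Concentrated into two week-long Dunkelflauten it needs real firm capacity
# and a lot of stored energy:
hours = 2 * 7 * 24
print(f"concentrated: {shortfall_gwh / hours:.1f} GW for {hours} h, "
      f"i.e. ~{shortfall_gwh:,.0f} GWh of storage or backup fuel")     # ~5 GW
```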
Particularly with the development of fracking, natural gas production is no longer just a byproduct of oil production, and can be (and is) pursued independently. Nevertheless, I agree that developing renewables should be our priority.