It will be more relevant in the future, but it's still worth thinking about. Right now intermittent energy sources cover around 15-18% of the total energy consumption in Germany[1]. And seasonal variability is covered by other methods (natural gas and others).
But since roughly two thirds of fossil primary energy is wasted as heat, intermittent sources supply more like 40% of the useful energy.
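The arithmetic behind that jump can be sketched quickly (the numbers below are the rough figures from the comments above, not official statistics, and the ~1/3 fossil conversion efficiency is an assumed round value):

```python
# Rough illustration of the primary energy fallacy.
renewable_primary = 0.16   # ~15-18% of primary energy consumption (assumed midpoint)
fossil_primary = 1.0 - renewable_primary
fossil_efficiency = 1 / 3  # ~2/3 of fossil primary energy lost as waste heat

useful_renewable = renewable_primary               # electricity is already "useful"
useful_fossil = fossil_primary * fossil_efficiency
share_of_useful = useful_renewable / (useful_renewable + useful_fossil)
print(f"{share_of_useful:.0%}")  # prints 36%, i.e. close to the ~40% claim
```

The exact share depends on how you count non-electric renewables and waste-heat reuse, but the order of magnitude holds.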
Some people don't know about the primary energy fallacy; others know about it and try to exploit it. So you should be suspicious of anyone using it to suggest lack of progress and futility.
If you measure useful energy as the electricity output of a fossil fuel plant, then yes. But in many cases the waste heat is used in other applications, for example district heating or low-grade industrial heat.
If you use fossil fuel to directly drive an industrial process, for example melting of ores/metals/glass then the efficiency is much higher.
Electricity can still be more efficient for many of these with heat pumps, like indoor heating and steam production. The gap is smaller than for engines doing mechanical work, of course.
And Germany right now has battery storage equivalent to fully powering the country for about 30 minutes, and it's rising every month, which is quite wild: https://battery-charts.de/
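A back-of-the-envelope check of that "~30 minutes" figure, using assumed round numbers (see battery-charts.de for the actual current data):

```python
# How long installed batteries could cover average demand.
average_load_gw = 60.0  # assumed rough average German electricity demand, GW
storage_gwh = 30.0      # assumed installed battery energy capacity, GWh

hours = storage_gwh / average_load_gw
print(f"{hours * 60:.0f} minutes")  # prints 30 minutes
```

Both inputs are illustrative; the point is just that capacity in GWh divided by load in GW gives the coverage duration directly.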
With those feeding on negative-priced electricity, intermittent sources will only get more economical to the detriment of gas and nuclear.
Every hour you don't run your nuclear power plant at full capacity you lose money. Nuclear power is mostly capex. You need to maximize utilization if you want to be profitable.
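The capex-dominated cost structure can be illustrated with a toy calculation (all figures below are made-up round numbers, not real plant economics): because nearly all costs are fixed, cost per kWh scales roughly inversely with the capacity factor.

```python
# Toy model: why utilization dominates the economics of a capex-heavy plant.
fixed_cost_per_kw_year = 400.0  # assumed annualized capex + fixed O&M, $/kW/yr
variable_cost_per_kwh = 0.01    # assumed fuel + variable O&M, $/kWh

def cost_per_kwh(capacity_factor: float) -> float:
    kwh_per_kw_year = 8760 * capacity_factor  # hours in a year * utilization
    return fixed_cost_per_kw_year / kwh_per_kw_year + variable_cost_per_kwh

for cf in (0.9, 0.5):
    print(f"capacity factor {cf:.0%}: ${cost_per_kwh(cf):.3f}/kWh")
```

Halving the capacity factor here nearly doubles the cost per kWh, which is the mechanism behind the "every idle hour loses money" point.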
It's far worse not to have sufficient electricity during the night or on overcast days. You can just increase nuclear electricity prices during that time to make up for the lost revenue from sunny days.