* Energy gain in this context only compares the energy generated to the energy in the lasers, not to the total amount of energy pulled off the grid to power the system, which is much higher.
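To make the distinction concrete, here's a rough sketch using the widely reported figures from the 2022 NIF shot (~2.05 MJ of laser energy delivered to the target, ~3.15 MJ of fusion yield, and on the order of ~300 MJ drawn from the grid to charge the laser system; the grid figure is approximate):

```python
# Rough gain accounting for an NIF-style shot (illustrative, approximate numbers).
laser_energy_mj = 2.05    # laser energy delivered to the target
fusion_yield_mj = 3.15    # fusion energy released
wall_plug_mj = 300.0      # approximate energy pulled from the grid to fire the lasers

scientific_gain = fusion_yield_mj / laser_energy_mj   # the "gain" in the headlines
engineering_gain = fusion_yield_mj / wall_plug_mj     # gain against total grid input

print(f"scientific gain:  {scientific_gain:.2f}")
print(f"engineering gain: {engineering_gain:.4f}")
```

The headline gain comes out above 1; the gain against what actually came off the grid is around a percent.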
I'm leaning towards the "in mice" but still a step forward.
The one from a few years ago was the big breakthrough, in that it got more fusion energy out than the "effective" input energy (the laser energy delivered), proving possibility. This one is mostly follow-up work (testing, replication, improved yield, etc.), so per se it's not as great an achievement, unless you're down in the weeds.
Think of it as curing a very hard disease, like rabies, in mice, where there had been no success before this point. It shows possibility rather than just a theory on a chalkboard.
The next large test (ICF is not thought to be scalable to true parity) is the large-scale ITER tokamak, scheduled for around 2025, which is supposed to reach overall breakeven, though we will see.
The issue with applying any of this is going to be the longevity of the materials in use, in my opinion. Fission is hard enough on the containment in terms of wear and tear, fusion will have even more challenges. We are still permanently 20 years away from fusion in my opinion, lol.
I don't know about the physics, but it's an engineering "in mice".
They got more energy out of the plasma than they put in with the lasers, yes. But the lasers are only a few percent efficient, and there are huge losses getting the power from the capacitor banks to the lasers (resistive losses are proportional to the square of the current, and we're talking very large currents). And there are further inefficiencies up the chain. Overall, the useful energy out would be well under a hundredth of the energy pulled from the grid. And it was generated only for a tiny fraction of a second.
Down-chain, actually being able to use the generated energy is its own chain of losses and inefficiencies.
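The loss chain above can be sketched by just multiplying stage efficiencies together. These numbers are illustrative assumptions, not measured values (real laser wall-plug efficiency and conversion efficiencies vary and are hard to pin down):

```python
# Sketch of the chain from wall plug to usable electricity (assumed efficiencies).
stages = {
    "grid -> capacitor banks": 0.90,    # charging losses (assumed)
    "capacitors -> laser light": 0.01,  # flashlamp-pumped lasers: ~1% (assumed)
    "laser -> fusion yield": 1.5,       # the headline gain: the only stage > 1
    "heat -> electricity": 0.40,        # assumed thermal-cycle efficiency
}

overall = 1.0
for stage, factor in stages.items():
    overall *= factor
    print(f"{stage:26s} x{factor:<5} cumulative: {overall:.4f}")
```

One stage above 1 doesn't help much when every other factor in the product is well below 1.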
TL;DR: don't expect to see commercial fusion any time soon, maybe not even in your lifetime.