Do you think it's likely that the Standard Model cannot sufficiently explain superconductivity? Or alternatively, that it can be perfectly explained, but the actual analysis is just super difficult?
I find this interesting because pretty much everything else in day-to-day life seems to have been explained by the Standard Model, minus gravity.
I would be extremely surprised if we needed anything beyond non-relativistic quantum mechanics coupled to classical electrodynamics to describe the next theory of superconductivity. If anything beyond that is required, I would be absolutely stunned if it were more advanced than relativistic quantum electrodynamics.
The trouble with condensed matter physics has never been that the underlying phenomena are complicated. We’ve had a more than sufficient grasp of the microscopic physics of materials since the 40s. I could write down the complete ‘theory of everything’ Hamiltonian for a condensed matter system in like two lines. The trouble is not the fundamental building blocks, but how you take a mathematical description of electrons bumping into ions and then generalize that to 10^23 electrons and ions.
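For concreteness, the two-line ‘theory of everything’ Hamiltonian alluded to here is, schematically, just kinetic energy plus Coulomb interactions (in Gaussian units; electrons i, j with mass m_e, nuclei I, J with masses M_I and charges Z_I):

H = -\sum_i \frac{\hbar^2}{2m_e}\nabla_i^2 \;-\; \sum_I \frac{\hbar^2}{2M_I}\nabla_I^2 \;+\; \frac{1}{2}\sum_{i \neq j}\frac{e^2}{|\mathbf{r}_i - \mathbf{r}_j|} \;-\; \sum_{i,I}\frac{Z_I e^2}{|\mathbf{r}_i - \mathbf{R}_I|} \;+\; \frac{1}{2}\sum_{I \neq J}\frac{Z_I Z_J e^2}{|\mathbf{R}_I - \mathbf{R}_J|}

All the difficulty is hidden in the interaction terms, which couple every one of the ~10^23 coordinates to every other.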
The game is to make an approximation that gets rid of the stunning complexity of the full theory while still preserving the features that are relevant to the problem you’re trying to solve.
Imagine trying to understand how a modern computer running a video game works, but all you understand is the basics of logic gates. Sure, in principle you could understand what’s going on in terms of bit flips, but it’s hopeless in practice. Interacting, strongly correlated quantum mechanical systems are very literally exponentially more complex than a classical system like a computer.
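To make “exponentially more complex” concrete: a classical register of N bits takes N bits to describe, but a general quantum state of N spin-1/2 particles needs 2^N complex amplitudes. A rough back-of-envelope sketch (assuming 16 bytes per complex double-precision amplitude):

```python
# Memory needed to store the full state vector of N spin-1/2 particles,
# versus the N bits a classical computer uses for a register of that size.
BYTES_PER_AMPLITUDE = 16  # one complex double: 2 x 8 bytes

def state_vector_bytes(n_spins: int) -> int:
    """A general N-spin quantum state has 2**N complex amplitudes."""
    return (2 ** n_spins) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"N = {n:2d} spins: {gib:,.2f} GiB for the full state vector")

# Adding a single particle doubles the memory (and roughly the work):
assert state_vector_bytes(31) == 2 * state_vector_bytes(30)
```

At N = 30 you already need 16 GiB; at N = 50 you are past the storage of any machine on Earth, which is why “just simulate the Hamiltonian” is not an option.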
>Imagine trying to understand how a modern computer running a video game works, but all you understand is the basics of logic gates. Sure, in principle you could understand what’s going on in terms of bit flips, but it’s hopeless in practice.
See "Could a Neuroscientist Understand a Microprocessor?"
Thank you. This is encouraging me to believe that huge materials science progress can be made in the next 50 years driven partially by computational advances.
That depends on what you mean by computational advances. If you mean advances in computer hardware, I wouldn't hold my breath on that helping much, since these are generally exponentially scaling problems, meaning that tiny incremental improvements in simulation fidelity or system size require a doubling of computational resources.
Quantum computers, if large enough ones ever materialize, could give us a revolution in computational many-body physics, because they could actually circumvent this exponential scaling.
The most promising straightforward avenue though is actually just developing better algorithms that make more insightful, appropriate approximations to correctly capture the physics we care about. That may or may not be what you meant by computational advances.
> The trouble is not the fundamental building blocks, but how you take a mathematical description of electrons bumping into ions and then generalize that to 10^23 electrons and ions.
Has anyone tried to generalize it to, say, 10^2 electrons and ions? What is the smallest system that empirically exhibits superconductivity?
Most quantum Monte Carlo work on fermionic systems deals with a number of particles on the order of 10^2. The numerical work suggests even such a small number can superconduct, and we have seen experimentally that there are superconducting nanoparticles that can be very small indeed. I don’t recall off the top of my head how many atoms are in the smallest ones we’ve seen though.
As far as pen-and-paper theoretical work goes though, something like 10^2 atoms is actually more difficult than 10^23 of them because 10^23 is basically infinite for our purposes so there are all sorts of useful limits one can take. When you consider something like 10^2 atoms, many useful approximations go out the window. You can no longer ignore the physics at the edge of the material since everything is very close to the edge, you can’t make assumptions about homogeneity, statistical arguments about the aggregate behaviour of electrons become unreliable, etc.
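A toy illustration of the finite-size point, using a standard textbook model rather than anything specific to superconductivity: for a 1D tight-binding chain of spinless fermions with open boundaries, the single-particle energies are ε_m = -2t cos(mπ/(N+1)), and the ground-state energy per site at half filling only approaches the N → ∞ (thermodynamic limit) value of -2t/π once the edges stop mattering.

```python
import math

def energy_per_site(n_sites: int, t: float = 1.0) -> float:
    """Ground-state energy per site of an open tight-binding chain of
    spinless fermions at half filling (lowest n_sites//2 levels filled).
    Open boundaries: eps_m = -2 t cos(m*pi/(N+1)), m = 1..N."""
    levels = sorted(-2 * t * math.cos(m * math.pi / (n_sites + 1))
                    for m in range(1, n_sites + 1))
    return sum(levels[: n_sites // 2]) / n_sites

bulk = -2 / math.pi  # exact energy per site in the thermodynamic limit
for n in (10, 100, 1000):
    e = energy_per_site(n)
    print(f"N = {n:4d}: E/N = {e:+.5f}  "
          f"(bulk limit {bulk:+.5f}, finite-size error {abs(e - bulk):.5f})")
```

The finite-size error shrinks roughly like 1/N: at N = 10 every site is effectively an edge site, while at N = 1000 the infinite-system answer is already a good approximation. This is exactly why 10^23 is often the *easier* problem on paper.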
We know that machine learning gets a lot of attention from physicists these days, and although some of these applications look unrealistic or far-fetched, it seems to me from your description that condensed matter physics could be attacked from this direction, and it might be feasible.
You start with 1D problems, move to small 3D volumes, and build from there. ML is good at "interpolating" or guessing what the answer must be from a given input. Maybe, one will need many nested levels of ML models to realistically simulate a solid-state material, but this is not entirely impossible, I think.
If such an ML simulation can be done reasonably efficiently, then there could likely be a theory, formulated in terms of equations, that approximates the "theory of everything" in sufficient detail.
A lot of people are trying to explore modern ML methods in physics, but they’ve so far had limited success, at least in terms of interesting theoretical results. One glaring problem is that ML methods tend to be black boxes, so even if someone solved all of condensed matter physics with some super neural network, the questions would remain “how and why was this able to work?” and “what is going on physically?”, which is what physicists tend to actually care about, more than being able to make a prediction.
On the other hand, there is plenty of work on rule extraction from trained neural networks being done. Wouldn't it be amazing if we extracted actual laws of physics from a network trained by observations?
What I'm saying is that if an ML model can approximate quickly the solution, then there could be a simple theory expressed in terms of equations, approximating the full problem. I.e. if there is an efficiently computable procedure, then it's a sign that there might be a good simple approximating theory.
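A minimal sketch of the "extract a simple law from a fitted model" idea, in a deliberately toy setting (the hidden law, the data, and the noise level here are all made up for illustration): if the underlying relationship is a power law y = a·x^p, ordinary least squares in log-log space recovers the exponent, which is the kind of closed-form structure one would hope to read off a trained model.

```python
import math
import random

random.seed(0)

# Hypothetical "observations" generated from a hidden power law y = 3 * x**2,
# with small multiplicative noise.
xs = [0.5 + 0.1 * i for i in range(50)]
ys = [3.0 * x**2 * math.exp(random.gauss(0.0, 0.01)) for x in xs]

# Fit log y = log a + p * log x by ordinary least squares.
lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
p = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
     / sum((a - mx) ** 2 for a in lx))
log_a = my - p * mx

print(f"recovered law: y = {math.exp(log_a):.2f} * x**{p:.2f}")
```

Real rule extraction from a deep network is of course far harder than a two-parameter fit, but the goal is the same: turn a black-box predictor back into an equation a physicist can interrogate.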
The Standard Model is basically a description of the fundamental forces, and is usually talked about by particle physicists studying things like the Higgs, quarks, etc. In condensed matter (materials) the only relevant interactions are electromagnetism and quantum effects; strong and weak interactions play no role because of the length scales at play (angstroms to nm) and the energy scales (meV to eV). Superconductivity involves phonons (vibrations) and electrons, at least in BCS, so you will never really find anyone discussing the 'Standard Model' and superconductivity, because the former is fundamental, while superconducting materials require consideration of emergent and many-particle effects which come about by having so many particles in a solid. Not sure I cleared anything up, but hope that helps.
Like most of quantum chemistry, I imagine the Standard Model works just fine for simulating superconductivity to whatever degree of accuracy we (practically) need. The issue is finding an algorithm that can efficiently perform the calculation itself. I don't know much about condensed matter physics, but I know that for my own research in the past, something called the fermion sign problem (NP-hard) made accurate quantum chemical calculations extremely difficult.
Matthias Troyer and Uwe-Jens Wiese, "Computational Complexity and Fundamental Limitations to Fermionic Quantum Monte Carlo Simulations", Phys. Rev. Lett. 94, 170201 (2005):
"Quantum Monte Carlo simulations, while being efficient for bosons, suffer from the “negative sign problem” when applied to fermions — causing an exponential increase of the computing time with the number of particles. A polynomial time solution to the sign problem is highly desired since it would provide an unbiased and numerically exact method to simulate correlated quantum systems. Here we show that such a solution is almost certainly unattainable by proving that the sign problem is nondeterministic polynomial (NP) hard, implying that a generic solution of the sign problem would also solve all problems in the complexity class NP in polynomial time."
These quantum Monte Carlo simulations are only necessary because we don't have quantum simulators [1], so yes, while a quantum computer would not necessarily solve the sign problem, it would make it irrelevant.
QMC computes multiple superimposed versions of classical MC at the same time, giving a polynomial speedup, not good enough for NP hard problems.
QMC cannot speed up simulation of a different quantum system - unless you're simulating a strict subset of your specific machine.
Quantum simulator is not a QMC system - it is designed to exhibit same behavior as a modelled system. That makes it somewhat useless for a system whose properties you do not understand. Any grid of fermions does not act like any other - if it did, BCS version would be accurate.
I don't think you understand what quantum Monte Carlo is or why it works. Nearly everything you've said is just plain incorrect.
Furthermore, I never said quantum simulation was QMC. I said that having quantum computers able to simulate many-body Hamiltonians would make QMC irrelevant, or at the very least would not suffer the same limitations that QMC suffers, specifically for strongly correlated fermion problems.
I suggest getting a stronger grasp on the literature before you go around making claims like this. Whether you know it or not, you're actively spreading misinformation and making the internet a worse place.