There is some skepticism around this, to say the least [1]. The core of it is that this group has found new bosons twice before:
The Atomki group has produced three previous papers on their beryllium-8 experiments — conference proceedings in 2008, 2012 and 2015. The first paper claimed evidence of a new boson of mass 12 MeV, and the second described an anomaly corresponding to a 13.45-MeV boson. (The third was a preliminary version of the Physical Review Letters paper.) The first two bumps have disappeared in the latest data, collected with an improved experimental setup. “The new claim now is [a] boson with a mass of 16.7 MeV,” Naviliat-Cuncic said. “But they don’t say anything about what went wrong in their previous claims and why we should not take those claims seriously.”
It's important to keep in mind that there are always plenty of outstanding experimental anomalies in physics. At the moment, this is one of ~40 roughly equally credible hints towards new physics, and it's more likely than not that all of those hints will fade away over time. That isn't anybody's fault either: it has always been like this, and it happens because experiments are difficult and subtle.
Personally, I still find this extremely exciting, even though history tells us it has less than a 1% chance of panning out. A 1% chance of revolution is still meaningful. But don't be too surprised if we land in the 99%.
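As a back-of-envelope check on those two numbers (taking the ~1% per-hint chance at face value and assuming the ~40 hints are independent, which is a simplification):

```python
# Back-of-envelope: if each of ~40 outstanding anomalies independently has
# a ~1% chance of being real new physics, how surprised should we be?
p_real = 0.01   # assumed per-hint probability (the "less than 1%" above)
n_hints = 40    # rough count of comparable anomalies (the "~40" above)

p_all_fade = (1 - p_real) ** n_hints
print(f"P(all {n_hints} hints fade away): {p_all_fade:.2f}")  # ~0.67
print(f"P(at least one is real):       {1 - p_all_fade:.2f}")  # ~0.33
```

So "more likely than not that all of those hints will fade away" and a still-meaningful chance that something in the pile is real are both consistent with the ~1% figure.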
How do physicists control model parsimony, so that the number of particles doesn't keep growing indefinitely? Would it be reasonable to think of the "true model" of the universe as having infinite complexity, and we are trying to approximate it by adding features and interactions stepwise? And then some of the non-standard models (e.g. string theory) tear much of it down and start over?
Your question could deserve a book-length treatment, but one quick thing I can say is that the system is self-correcting. The more bells and whistles you add onto the Standard Model, the less anybody else will want to build upon it. And complexity doesn't rise steadily over time. Because of the LHC results, many models these days are a lot simpler (at least to me) than 10 years ago.
It is true that in addition to bolting on correct features, you occasionally need syntheses that simplify the whole thing, like how the zoo of mesons and baryons was explained by quarks. But at the moment, we're very far from having that problem...
There is no formal process for assessing the complexity of a theory or an interpretation, which leads to some interesting disagreements.
Consider the Many-Worlds Interpretation[0] of quantum mechanics. Its proponents say you need just a pretty small assumption (the Universe branches under certain conditions), and its critics say that assumption is a huge deal, because it postulates an astronomically high (erm, cosmologically high) number of Universes that we can't observe.
> Would it be reasonable to think of the "true model" of the universe as having infinite complexity, and we are trying to approximate it by adding features and interactions stepwise?
Nobody knows.
So far, we have very successful theories (general relativity, quantum field theory) that rely on a small number of equations with a small number of terms each. That gives hope that at least the laws that govern the Universe are not infinitely complex, but we cannot really know for sure.
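For a sense of scale: the field equations of general relativity fit on one line (ten independent component equations once the symmetric indices are counted):

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

Geometry on the left, the cosmological constant, and matter/energy on the right; most of the rest of the theory is unpacking those symbols.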
Looks like you have some good questions for [SE.Physics](https://physics.stackexchange.com/)! @knzhou's profile shows that they're a member there.
Short version's probably that we don't quite have enough data to compellingly argue for excessively elaborate descriptions.
I mean, yeah, every time there's any little observation -- even one that's nowhere near statistically significant -- there's technically a tiny bit of information in it that could, in principle, be used to inform a model better than one that excludes it. But at present, folks don't tend to care; too much mental/computational overhead for too little payoff.
Cows in Physicsland aren't spherical because physicists couldn't conceive of more precise descriptions, but rather because non-spherical cows were too much work.
I do not, but I can think of a few off the top of my head - the muon g-2 anomaly, the Antarctic upward-moving particle anomaly, the Hubble constant tension, the proton radius puzzle was one until quite recently (but is now largely resolved), the (not statistically significant yet, but they've been in the data since 2013 and they keep not going away) hints of lepton universality violation at the LHC...
Let's also add the PVLAS anomaly, the white dwarf cooling hint, the 'too big to fail' problem, the cusp/core problem, missing satellites, the DAMA/LIBRA anomaly, the MiniBooNE excess, the Valentine's day monopole, the reactor neutrino anomaly, the Hooperon, the AMS positron excess...
You can go straight to review articles, which are generally available for free on the arXiv. Or if you want the firehose, you can scan for papers whose abstracts contain words like 'hint', 'anomaly', 'excess', 'discrepancy', and so on [0]. No individual has made a global list across all fields, because this would be a thankless and kind of pointless task; nobody has the time to be interested in every anomaly.
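If you want to automate that scan, here's a minimal sketch against the public arXiv API (the keyword list and the hep-ex category restriction are my own example choices; feedparser is a third-party package):

```python
import urllib.parse
import feedparser  # third-party: pip install feedparser

KEYWORDS = ["hint", "anomaly", "excess", "discrepancy", "tension"]

# Fielded arXiv API query: abstracts in hep-ex mentioning any of the keywords.
search = "(" + " OR ".join(f"abs:{k}" for k in KEYWORDS) + ") AND cat:hep-ex"
params = urllib.parse.urlencode({
    "search_query": search,
    "sortBy": "submittedDate",
    "sortOrder": "descending",
    "max_results": 25,
})
url = f"http://export.arxiv.org/api/query?{params}"

# The API returns an Atom feed, which feedparser fetches and parses directly.
for entry in feedparser.parse(url).entries:
    print(entry.published[:10], entry.title.replace("\n", " "))
```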
That's a witty quip, but it's witty because it's unfair.
We don't grade theories based on what fraction of the stuff is obviously visible. If that were true, even electromagnetism would be a terrible theory because the vast majority of the spectrum we consider is invisible to human eyes. What always matters is how well the theory predicts given how simple its assumptions are, and dark matter is great at that.
Given how entrenched GR is now, confirmed so many times, it's pretty clear we're at worst dealing with a Newton-degree of understanding: maybe not the whole picture, but certainly true within its boundaries. Whatever lies 'after' GR can only build upon what GR got right.
However, in the case of dark matter, let's not forget that a model may be incredibly "right" and yet false, e.g. Bohr's atom: still taught today until you enter quantum mechanics proper, but totally OK (valence etc.) for basic chemistry. Newton is also totally OK for small masses and non-relativistic speeds, even though the model is ultimately wrong. Dark matter may likewise 'work' OK for representing and estimating 'basic galactic motion', yet be totally flawed as a model.
We would need to test the stuff itself, not its effects, to validate any kind of theory about it.
Isn’t the cosmological constant the OG fudge factor in GR?
AFAIK we need dark matter to explain any theory of gravity; the observations don't match what we would expect from classical mechanics either, which is still the go-to theory for things like galaxies.
MOND was introduced initially to fix the observations, mostly with regard to how we understand Newtonian mechanics. The relativistic versions of MOND can be used as an alternative to GR, but these have been more or less debunked by the observations of gravitational waves, since afaik all the versions of MOND have "instant gravity", just like Newtonian mechanics.
Consequently the rest of your last paragraph is incorrect.
"we need dark matter to explain any theory of gravity"
No. MOND is a theory of gravitation, but it's not relativistic, and it does not work at scales larger than that of galaxies (it is notably wrong with respect to the peculiar motions of galaxies in massive clusters, and a residual mass term must be added, Famaey & McGaugh §6.6.4: essentially, MOND still needs dark matter at galaxy-cluster scales, even if it were to correctly describe all the individual galaxies in the massive cluster).
Additionally, General Relativity does not require dark matter any more than Newton's F=ma requires dark matter. The issue is that General Relativity's G=T, like F=ma, does not tell you about the initial trajectories; you plug those in by hand. If you start with the trajectories learned by observing galaxies (as Vera Rubin did), you can work out a stress-energy tensor that satisfies those trajectories -- and the majority of it has to be electromagnetically uncharged, interacting very weakly or even only gravitationally (i.e., it can't clump or diffuse on scales of mere millions of years), and slow-moving. If neutrinos weren't so inclined to zip around at speeds very close to that of light, they'd fit nearly perfectly; unfortunately, we're left trying to find the microscopic details of the unknown parts of the stress-energy tensor.

Milgrom's MONDian response to Rubin's discovery that the orbits of "surface" stars are non-Keplerian was to turn the "a" in F=ma into a function of the acceleration itself (which, for orbits in a galaxy, amounts to a dependence on the radial distance from the galaxy's core); the function was found empirically, comparable to how the stress-energy distribution was found. However, it's that effective dependence on a coordinate distance that makes this approach non-relativistic.
(For terseness, in the paragraphs above I've discarded some factors, set the constants c and G to unity, and omitted the Greek-letter indices on the stress-energy and Einstein tensors (T resp. G). In the MOND context, F=ma is more appropriately written as in the first paragraph of §6 of Famaey & McGaugh.)
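To spell out the reasoning above with units restored (these are the standard textbook relations; a_0 is Milgrom's empirically fitted acceleration scale):

```latex
% Newtonian circular orbit around enclosed mass M(<r); outside the visible
% disk one expects Keplerian falloff, v ~ r^{-1/2}. Rubin instead observed
% v(r) ~ const, which forces M(<r) ~ r: the inferred stress-energy keeps
% growing where little light is seen.
v^2(r) = \frac{G\,M(<r)}{r},
\qquad v \approx \text{const} \;\Rightarrow\; M(<r) \propto r

% Milgrom's modification, with \mu(x) \to 1 for x \gg 1, \mu(x) \to x for
% x \ll 1, and a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m\,s^{-2}}:
F = m\,\mu(a/a_0)\,a

% In the deep-MOND limit (a \ll a_0) a circular orbit satisfies
% a^2/a_0 = GM/r^2, giving a flat asymptotic rotation curve:
v_\infty^4 = G\,M\,a_0
```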
I'm not sure whether I explained myself too simplistically or you haven't understood my post.
My points were:
1) DM isn't a fudge factor for GR or Newtonian Mechanics. (Classical) Newtonian Mechanics doesn't have one; PPN does, but PPN isn't a theory.
GR has a fudge factor built into the theory - the Cosmological Constant - which gives you a variable that can adjust the predictions to match observations, since it wasn't known at the time whether the universe was static or not. It's not a perfect fudge factor, since it can't deal with accelerated expansion easily, but it's a fudge factor nonetheless.
2) MOND was based on classical mechanics and as such isn't a replacement for GR. I'm not sure if I agree with the assertion that TeVeS is an extension of GR. And yes, all theories of gravity need "Dark Matter".
Dark Matter isn't a fudge factor; it's a placeholder for the missing mass needed to align predictions with observations, no matter whether those predictions are derived from GR, Classical Mechanics, or, as you've mentioned, even MOND.
And yes, MOND still requires "Dark Matter", or to be exact some additional mass. However, it requires much less of it, and it requires it to be concentrated in the centers of galaxies, which means it can be much more easily accounted for through known mechanisms and forms of matter (e.g. the black holes in the center of a galaxy are more massive, higher density of interstellar medium and gas, etc.). Hence it can be described as a theory that "solves" the problem of Dark Matter, because it doesn't necessarily require new forms of matter or complex explanations for the missing mass.
However, like you've mentioned MOND is a flawed theory. It doesn't even work on galaxies that well: while it can describe their movement today, it has difficulties aligning with observations of various clusters, and more importantly it doesn't work well when you start to wind back the arrow of time. You don't even have to go as far back as the formation of galaxies (which isn't possible under vanilla MOND); it can fail as quickly as rewinding the clock half a billion years in some cases.
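For intuition about why MOND does so well on present-day galaxy rotation curves (the one regime everyone in this thread agrees it handles), here's a toy comparison for a point mass; the "simple" interpolation function mu(x) = x/(1+x) is just one common choice, and the galaxy mass is illustrative:

```python
import math

G  = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10          # Milgrom's acceleration scale, m s^-2
M  = 1e11 * 1.989e30  # toy galaxy: 1e11 solar masses, in kg

def v_newton(r):
    """Keplerian circular speed: falls off as r^-1/2."""
    return math.sqrt(G * M / r)

def v_mond(r):
    """MOND circular speed with the 'simple' mu(x) = x/(1+x).
    Solving g * mu(g/A0) = gN gives g = (gN + sqrt(gN^2 + 4*A0*gN)) / 2."""
    gN = G * M / r**2
    g = (gN + math.sqrt(gN**2 + 4 * A0 * gN)) / 2
    return math.sqrt(g * r)

kpc = 3.086e19  # metres per kiloparsec
for r_kpc in (1, 5, 20, 50, 100):
    r = r_kpc * kpc
    print(f"r = {r_kpc:>3} kpc: Newton {v_newton(r)/1e3:6.1f} km/s, "
          f"MOND {v_mond(r)/1e3:6.1f} km/s")
# Newtonian speeds keep dropping; MOND flattens out near (G*M*A0)**0.25.
```

This toy says nothing about clusters or early-universe structure, which is where the objections in this thread live.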
1/ I was restricting my comments to dark matter; sorry if you read a comment about dark energy into my reply. Probably I should have made it explicit that I wasn't touching your "fudge factor" comment. Coincidentally, Sabine Hossenfelder has just published this in the past day or so: http://backreaction.blogspot.com/2019/11/what-is-dark-energy... and you might want to take your comment about "fudge factor" there, although I'd bet a doughnut without looking that at least two other assiduous commentators there have already done so. Hopefully someone in there will discuss Jeans instability and the Raychaudhuri focusing theorem if it gets sufficiently technical: the latter was a later discovery which (had he been aware of it) would have led Einstein (and practically everyone else) more quickly away from a steady-state cosmology and towards something like Einstein-de Sitter or Einstein-Friedmann as a way of solving the problems arising in the former.

One should also bear in mind that until 1922 M31 (then the "Great Andromeda Nebula") was not known to be comparable in size to the Milky Way, nor to lie at a distance of more than a few thousand light years (as opposed to the modern figure of ~2.5 million), and that it wasn't until several years later that it became known that there are lots of galaxies in the sky -- this was all after the cosmological constant was introduced by Einstein in 1917.
2/ I will restrict myself to just:
"I'm not sure if I agree with the assertion that TeVeS is an extension of GR."
Bekenstein's tensor-vector-scalar gravitation (TeVeS) is a theory of gravitation, like MOND and GR. You're right that as originally formulated it's not laying a field on top of GR, but like numerous theories of gravitation it was found to be inconsistent with observation. In particular, TeVeS does not allow for long-lived stars, and those appear in our sky in abundance. We also have lots of Einstein-lensing data that conflicts strongly with TeVeS predictions. Amusingly, one fix proposed for TeVeS's problem with "cosmic shear" weak gravitational lensing is to add a hidden mass term in galaxy clusters, with a specific proposal for a WIMP. (This doesn't fix TeVeS's other difficulties.)
This is dealt with in Famaey & McGaugh §7.4 (wherein there is a delightful summary of a generalization of TeVeS: "... a tensor-vector-scalar theory with an Einstein-like metric, an Einstein-Aether-like unit-norm vector field, and a k-essence-like scalar field", which is close to saying it's "just GR with two extra fields that you are free to place on either side of the Einstein Field Equations" as you'll likely read from such non-particle-DM academics), and in textbook treatments of the Parameterized Post-Newtonian (PPN) formalism (a convenient table is here https://en.wikipedia.org/wiki/Alternatives_to_general_relati... and surrounding text). Many of the PPN parameters are chosen so that they can be individually tested; hackernews user ISL does that for a living!
2(b)/ restricting to "the black holes in the center of a galaxy are more massive, higher density of interstellar medium and gas"
The work of http://www.astro.ucla.edu/~ghezgroup/gc/ and others puts strong limits on the stress-energy in the central parsec, and more broadly in the core: your approach does not work in the Milky Way at all, and yet observations are increasingly consistent with a dark matter halo (see https://en.wikipedia.org/wiki/Dark_matter_halo#cite_note-32 and cite note 33 ibid., which review the Milky Way's rotation curve).
2(c)/ "you've mentioned MOND is a flawed theory". The theory itself is fine, it's just not a good match for extragalactic observations. I have zero problem in using pure Milgromian MOND in studies of the doppler shifts of molecular gas clouds in LSB elliptical galaxis, for instance. But it doesn't work at all in cosmology, and is grossly wrong in the solar system and in mergers of massive compact objects, and "fixes" for those regimes are even more unwieldy than the GR-based linearizations and other GR-based post-Newtonian expansion techniques already in use.
Finally,
"wind back the arrow of time"
What does that mean?
From context I think you are just saying that large scale structure formation is not adequately explained by theories that lack some form of dark matter, but "the arrow of time" means something to cosmologists (and physicists generally).
No, there are other reasons[1] we believe dark matter exists, it's not just a fudge factor in the math. There are multiple lines of evidence pointing at its existence.
Why not? They describe completely different phenomena, two different things we do not understand. There is no reason to think that they might have a single underlying solution, nor is there any theory that points to one.
Science can have more than one mystery, with independent leading solutions. It would be like historians declaring that they could believe in the Sea Peoples or Jack the Ripper but not both.
I find the E8 lattice incredibly interesting, if for no other reason than to find out whether it's truly representative of particle physics. It predicts more particles that we haven't observed yet.
I felt like the idea got a bit of a bad rap because the pun in the title inadvertently created a lot of media publicity of the sort "Surfer dude discovers simple answer to everything".
The E8 structure does make for a compelling idea, though I'm not aware of any particular evidence that makes it logically any better than existing ideas. It certainly is aesthetically pleasing.
I only wish my math skills were up to calculating the properties and values of particles based on the idea, so I could better understand what people should be looking for to prove/disprove it.
Honestly, even if the E8 lattice is nothing but imagination, it's so beautiful that we need to make a para-verse like that someday. You know, when we've mastered creating universes and stuff, at least when we're able to live in virtual fluff. The beings in such a simulated universe would be so in awe of 'everything' as they uncover the theory thereof.
We're going to need an independent experiment to see this "X17" before anyone truly believes it's a new particle. Some example major particle discoveries over the last few decades (and the independent experiments which observed them): Higgs (ATLAS & CMS at CERN); Top quark (CDF & D0 Fermilab); W & Z Bosons (UA1 and UA2 at CERN); J/Psi (Richter's team at SLAC & Ting's team at MIT). Some other fundamental particles were observed by a single experiment (the bottom quark, for example) and the community believed it, but in those cases it's always because a theory has already predicted the particle, and we've always been able to study them with future experiments anyway. So this won't gain traction unless another experiment can observe it.
Nobody should argue that advancements haven't been made since 1988, but nobody's manufacturing whole organic human organs yet --- chemical pieces of them: probably.
I have a faint understanding of the "Standard Model" of physics, which lays out a pattern of particles and forces that has so far held up to experimentation; e.g. the Higgs boson fit neatly into the model, exactly where the model predicted it would go.
The standard model predicted the Higgs, and the Higgs was found exactly how the model predicted, right? But I've never heard of this X17 particle before. Is this something the standard model has predicted? Does it go against the standard model, strengthen it, or neither?
Not necessarily, because it could give a hint. It is possible to get from various higher-dimensional gauge groups like SO(10) to almost the Standard Model. But the leftover particles often couple in unfavourable ways and lead to proton decay, for example.
every single article on the matter says that it would be a step towards potentially explaining dark matter and dark energy, which would help towards a unified theory
I hope to see a lot of things in my lifetime: life discovered outside our planet, human cloning, artificial organs and blood, AGI, humans on Mars, cures for a broad spectrum of cancers and Alzheimer's, human life extension / slowing aging, broad advances in chemistry and materials science, major advances in metabolomics, protein folding, drug discovery, gene therapies and repair...
> I hope to see a lot of things in my lifetime: life discovered outside our planet, human cloning, artificial organs and blood, AGI, humans on Mars, cures for a broad spectrum of cancers and Alzheimer's, human life extension / slowing aging, broad advances in chemistry and materials science, major advances in metabolomics, protein folding, drug discovery, gene therapies and repair...
The same list of things people were 'hoping for' in 1988. (From personal experience; probably a high estimate.) We're not an iota closer here in 2019.
Wishful thinking isn't enough; no matter how hard you hope and how much money you waste, you're not going to make a fundamentally better airliner or horse buggy.
What? Human cloning is basically possible but illegal; we can absolutely grow artificial organs (intestine, heart, certain brain structures); life extension and disease cures are closer via CRISPR; metabolomics has advanced; hospitals use glycomics for diagnosis; gene therapy is used in clinical trials; and machine learning for materials science, drug discovery, and protein folding is coming along.
What is your definition of iota? We're not there, but to say there's no progress is just untrue.
[1] https://www.quantamagazine.org/new-boson-claim-faces-scrutin...