Ah yes, nothing simpler than providing the foundational theory for one of the most rigorous and intellectually intimidating areas of mathematics - number theory /s
Well, the sequence 0, 2, 4, ..., 2k, ... is indeed simple: it can be recovered starting from its value at an arbitrary index (e.g. the last one announced). As can (3k), (5k), etc.
But the structure of what does not appear in any of them is fairly complex - from this perspective, if I give you n and p(n), you can't tell me anything about p(n+1), or about whether p(n)+2 is prime, without involving facts about ~n^(1/2) other sequences around n.
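A minimal sketch of that dependence, in ordinary Python (plain trial division; the helper and the example values are mine, purely illustrative):

```python
import math

def is_prime(m):
    # Trial division: deciding primality of m means consulting every
    # divisor candidate up to sqrt(m) -- the "~n^(1/2) other sequences"
    # around n that the comment above is pointing at.
    if m < 2:
        return False
    for d in range(2, math.isqrt(m) + 1):
        if m % d == 0:
            return False
    return True

# Contrast with (2k): knowing the value at any one index (say 100 at
# index 50) pins down the whole sequence. Knowing that p(25) = 97 gives
# no such shortcut for deciding whether 97 + 2 = 99 is also prime:
print(is_prime(99))   # False (99 = 3^2 * 11)
print(is_prime(101))  # True
```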
Gauss's estimate n/log(n) for the prime counting function holds asymptotically but is obviously inexact, as is the logarithmic integral. You would expect the discrepancy between two "simple" objects to be simple itself, but the error term's behavior is... hardly that.
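For concreteness, a rough numerical check (a sketch in plain Python; the sieve and the crude trapezoid estimate of the logarithmic integral are my own illustration, not anything from the paper):

```python
import math

def prime_count(n):
    # Sieve of Eratosthenes: the exact value of pi(n).
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sum(sieve)

def Li(n, steps=1_000_000):
    # Offset logarithmic integral Li(n) = integral from 2 to n of dt/log(t),
    # via a crude trapezoid rule (differs from li(n) by about 1.04).
    h = (n - 2) / steps
    edge = 0.5 / math.log(2) + 0.5 / math.log(n)
    inner = sum(1.0 / math.log(2 + i * h) for i in range(1, steps))
    return (edge + inner) * h

for n in (10**4, 10**6):
    print(n, prime_count(n), round(n / math.log(n)), round(Li(n)))
# pi(10^4) = 1229 and pi(10^6) = 78498; both approximations drift away,
# and how fast that error grows is exactly the hard question.
```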
With respect, this is an epic undertaking. For 150+ years, analysts and number theorists have devoted their careers to it without cracking the nut, although there has been great progress.
Another thing that appears very simple at first but gets wildly complex is Fourier analysis. It's just a way of writing functions in a trigonometric basis: the sinusoid is, in some sense, the simplest periodic curve, and we select the frequencies f = 0, 1, 2, ... Okay, but this is a basis for... what? That question is not simple. Another 200 years.
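Concretely, for a 2π-periodic function f the recipe is nothing more than the classical textbook formulas:

```latex
% Fourier series of a 2*pi-periodic function f: classical coefficient formulas
f(x) \sim \frac{a_0}{2} + \sum_{k=1}^{\infty} \bigl( a_k \cos kx + b_k \sin kx \bigr),
\qquad
a_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos kx \, dx,
\quad
b_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin kx \, dx .
```

All the difficulty hides in that "\sim": for which f, and in what sense (pointwise, in L^2, as distributions), does it become an equality? That's the question that ate the better part of those two centuries.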
The two are connected. This paper builds on work by Dirichlet, who was the first to try to put Fourier's ideas on a rigorous footing (in the 1820s), up through the development of Schwartz spaces in the 1950s, and applies these insights to the work of Gauss, Riemann, and countless others since. And we still don't understand the structure (e.g. bounds) of the error term!
Yes, it’s difficult to predict where such an understanding might lead. If it reframes and redefines all of number theory, then we might call it one component of the foundational theory of number theory.
Analogously, if someone proves that P = NP, then that will be great, but the significance of lambda calculus and Turing completeness will remain. If the proof is constructive and practical, we’ll just have to reprioritize and update the list of algorithms we teach to undergrads, issue performance-enhancement updates to some software libraries, and patch any security vulnerabilities. Otherwise, we’ll only need to change a chapter or two in the Theory of Computation courses that universities are increasingly deprioritizing.
> we’ll just have to reprioritize and update the list of algorithms we teach to undergrads, issue performance-enhancement updates to some software libraries, and patch any security vulnerabilities.
Wow, your optimism sure is something.
What are you patching and with what?
How do you “patch any security vulnerabilities” when said vulnerability is “all of our security research, except the one-time pad, is now so obsolete that we are unsure whether security based on computational complexity is practical at all anymore”?
Gosh, it was early in the morning and somehow I was thinking in terms of factoring numbers into primes when I added the security point. But consider cryptography as an application of number theory and computability theory.
Interestingly, if there are cryptographic alternatives that can still be relied upon when factoring is easy, but the same does not hold if P = NP in a practical sense, then that’s further support for the primary point that learning more about prime numbers would not reset the foundations of number theory.
Nah, it's not that complex. Did you ever take introductory number theory? Anyway, I think you missed the point that the description is really simple, and that's what gets overlooked. Hahaha! :)