Why there is no Hitchhiker’s Guide to Mathematics for Programmers (jeremykun.com)
108 points by dominotw on Sept 8, 2013 | 102 comments


While there is no "royal road" to mathematics for programmers who don't care about proofs, there is a programmer's road to proofs for those interested in Math.

In fact, the book is even called "The Haskell Road to Logic, Maths and Programming" [0]. It covers mathematical notation, proof construction, and lots of interesting portions of discrete math that should be of interest to programmers. And large portions of the results are demonstrated or used in interesting Haskell programs.

[0] http://www.amazon.com/Haskell-Programming-Second-Edition-Com...


The Haskell Road to Logic is a pretty good book.

I took a class on discrete mathematics that was taught only with formulas. I didn't understand any of it.

Then I read this book and all of it made sense.

"What I cannot program, I cannot understand.." (with my apologies to Feynman)


That book is on my to-read list, yet I don't know what to think of a book that starts like this: http://i.imgur.com/rSc0fpw.png


The validity of that statement depends on your historical perspective.

To the modern eye, we see radically different type systems and conclude they are as different as night and day.

But there was a time when the world was divided into people who thought anything higher-level than C or FORTRAN was stupid and impractical, and those who aspired to build more abstract languages. LISP was one of the rallying cries of the latter group. Haskell clearly falls into that group too.


And before that, there were the people who thought anyone using an assembler was wasting precious CPU cycles, and something like FORTRAN was considered heresy.


I'm not sure it makes a ton of sense either, even for an older book. OTOH, knowing a lot about programming language history/politics and knowing a lot about math and rigorous CS theory tend to be negatively correlated in my experience.

I know I should be demonstrating that assertion with hard numbers, but in my defense I do know a lot about programming language history and politics. ;)


Hm, that makes some sense; the book might just be old, that's why.


It can't be that old, since according to Wikipedia Haskell itself is only about 23 years old. http://en.wikipedia.org/wiki/Haskell_(programming_language)


Both Haskell and LISP take the "top-down" approach (approaching programming from an abstract angle, without concern for the underlying machine), as opposed to the "bottom-up" approach (where you explicitly manage I/O and memory, and the underlying machine is not abstracted away), like C or Ada.


    lambda calculus --> Haskell
                   \--> LISP --> Scheme
                            \--> Common Lisp


The Acknowledgements section has an interesting anecdote: Thierry Coquand apparently found the original lecture notes that would become the book out of the blue and seems to have contributed quite a bit.


I would add that Project Euler (www.projecteuler.net) is a fun place to practice.

To your point, many of the best solutions are done in Haskell. They make it seem so easy!


I like how the author mentions having an executable mathematical language might be helpful. Math is a non-standard non-machine-executable programming language which means it's inherently buggy. On top of that, now I need to interpret the "code" while I'm learning and it can be pretty dense.

Much like in programming, I'm sure expert mathematicians may not see why it's so painful, but wait until a more robust language/system is developed for the hobbyist. It will be extremely disruptive.


What? Due to the Curry-Howard Correspondence, mathematical proofs are in fact very precisely executable: http://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspond...
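
As a tiny illustration of the propositions-as-types reading (a toy Haskell sketch of mine, nothing from the article): read the function arrow as implication and pairs as conjunction, and each well-typed term below is an executable proof of the proposition in its type.

    -- Propositions as types, proofs as programs (a minimal sketch).
    -- The function arrow is implication; the pair type is conjunction.

    -- Proof that (A implies B) and (B implies C) give (A implies C).
    syllogism :: (a -> b) -> (b -> c) -> (a -> c)
    syllogism f g = g . f

    -- Proof that (A and B) implies (B and A).
    andComm :: (a, b) -> (b, a)
    andComm (x, y) = (y, x)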

There may be lots of room to improve the "UI" of math [0] but I don't see how you could desire more power or verifiability than what modern mathematical proof systems offer.

[0] http://worrydream.com/KillMath/


Only proofs in 'constructive' logic. There are a lot of proofs that aren't executable, most importantly proofs by contradiction.

See: http://en.wikipedia.org/wiki/Intuitionistic_logic

In practical terms, however, this isn't a huge hurdle and comes with the nice advantage that you always have a program.
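
To make the asymmetry concrete, here's a small Haskell sketch of my own, modelling "not A" as a function into the empty type Void: the constructive half of double negation is a one-liner, while the classical half, which proof by contradiction relies on, has no program behind it.

    import Data.Void (Void, absurd)

    -- "Not A" is modelled as A -> Void.

    -- Constructively fine: A implies not (not A).
    doubleNegIntro :: a -> ((a -> Void) -> Void)
    doubleNegIntro x notA = notA x

    -- The classical direction, not (not A) implies A, cannot be written:
    -- there is nothing of type a we could produce from the argument alone.
    -- doubleNegElim :: ((a -> Void) -> Void) -> a

    -- From a contradiction, anything follows (this part is still constructive).
    exFalso :: Void -> a
    exFalso = absurd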

BTW. I'm kinda sure you know this already, but I think it is important to mention.

Thanks for the Haskell proof-book link. I'll check it out sometime.


Hmm, I don't understand the problem. There is a philosophical question there, of course, but as long as you assume something like the law of excluded middle, you can easily set up a proof system allowing you to execute proofs by contradiction. This is a common strategy in Coq, for instance.

Maybe I'm not understanding your point, though. I also don't think this is the sense that darkxanthos meant, either :)


The idea is that the Curry-Howard isomorphism allows you to "play" with constructive mathematics easily. For example, consider the fact that the composition of two functions f and g is continuous if each function is continuous. Suppose f : X -> Y and g : Y -> Z. Then, in pseudo-Haskell syntax, we could say

  continuousf  : Open set in Y -> Open set in X
  continuousg  : Open set in Z -> Open set in Y
  continuousgf : Open set in Z -> Open set in X
  continuousgf = continuousf . continuousg
The idea is that the fact that f is continuous means that we can produce open sets in X given open sets in Y. Now, consider the fundamental theorem of algebra over complex numbers. This says that any polynomial with complex coefficients with degree greater than 0 has at least one complex root. We'd like to have a function

  fta : Polynomial P of degree >0 over C -> (Point x in C, proof that P(x)=0)
which would compute this root. But the thing is, the standard proof (e.g., by Liouville's Theorem) of the FTA is not constructive. The proof is by contradiction: Suppose there were no root. Then we violate some principle of complex analysis that we know is true (Liouville's Thm, or the Maximum Modulus Principle, for example). Thus there must be a root.

But this doesn't tell us at all how to compute the root! So instead we need a constructive proof of the FTA in order to produce a root.

So, the idea is that you can't "play" with non-constructive mathematics in programming as you can with constructive mathematics. Our standard FTA proof would look like

  fta : Polynomial P of degree >0 over C -> Exists x. P(x)=0.
But the type of the object that this function produces isn't very fun to play with. We have no idea what the root is!
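
If you want something you can actually load into GHCi, here's a crude sketch of my own (modelling an open set as a bare membership predicate, which drops all the actual topology) showing that the continuity witness for a composite really is the composite of the witnesses:

    -- Crude model: identify a subset of a space with its membership predicate.
    type OpenSet space = space -> Bool

    -- A continuity witness for f : x -> y sends open sets of y to open sets
    -- of x by taking preimages.
    continuityWitness :: (x -> y) -> (OpenSet y -> OpenSet x)
    continuityWitness f openInY = openInY . f

    -- The witness for g . f is the composition of the individual witnesses,
    -- exactly as in the pseudo-Haskell above.
    composeWitnesses :: (OpenSet y -> OpenSet x)
                     -> (OpenSet z -> OpenSet y)
                     -> (OpenSet z -> OpenSet x)
    composeWitnesses contF contG = contF . contG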


I think this is unfair: you're comparing a trivial observation to a nontrivial theorem, and complaining why the complicated thing isn't as easy to play with as the trivial thing.

The Curry-Howard isomorphism really doesn't add anything to the conversation that wasn't already there. Mathematicians already know what it means for a proof to be constructive or not constructive, and they know it's easier to work with constructive proofs.

Besides, we already know that there is no general way to compute roots of arbitrary polynomials using elementary arithmetic. What you really want is a procedure to find a description of such an x (to arbitrary accuracy, disregarding efficiency), and these are a dime a dozen.
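
For instance, a throwaway Newton-iteration sketch in Haskell (the coefficient list and starting guess below are made up for illustration; no convergence guarantees, just "a description of such an x" to arbitrary accuracy):

    import Data.Complex

    -- Evaluate a polynomial (coefficients listed from the constant term up).
    evalPoly :: [Complex Double] -> Complex Double -> Complex Double
    evalPoly cs x = foldr (\c acc -> c + x * acc) 0 cs

    -- Formal derivative of the same coefficient list.
    derivPoly :: [Complex Double] -> [Complex Double]
    derivPoly cs = zipWith (*) (map fromIntegral [(1 :: Integer) ..]) (drop 1 cs)

    -- Newton iteration from a starting guess; stops when the step is tiny
    -- or after a fixed number of iterations.
    newtonRoot :: [Complex Double] -> Complex Double -> Complex Double
    newtonRoot cs = go (100 :: Int)
      where
        go 0 x = x
        go k x
          | magnitude step < 1e-12 = x'
          | otherwise              = go (k - 1) x'
          where
            step = evalPoly cs x / evalPoly (derivPoly cs) x
            x'   = x - step

    -- newtonRoot [1, 0, 1] (1 :+ 1) approximates a root of x^2 + 1, namely i.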


> Mathematicians already know what it means for a proof to be constructive or not constructive, and they know it's easier to work with constructive proofs.

It is often easier to prove that some object exists (by contradiction) than to construct such an object.

For example: by applying Zorn's lemma it is easy to prove that any vector space has a Hamel basis. But it is often non-trivial to construct such a basis.


> Math is a non-standard non-machine-executable programming language which means it's inherently buggy.

I don't understand what you mean by non-standard. I can't think of any language more standard than mathematics.

Mathematical theorems are formally proven. That is a higher form of knowledge than empirical validation (running programs against tests). Another case of empirical validation is scientific experimentation. It validates theories by increasing our assessment of their plausibility, but never up to 100%. Mathematical proofs are the only completely objective form of knowledge. Automatic theorem provers do exist and are employed by mathematicians, but they are inherently limited, as proven by Gödel's undecidability theorems.


> Mathematical theorems are formally proven.

It's formal compared to colloquial English, but completely vague compared to computer programs.

There are automatic theorem provers, but it isn't Gödel's undecidability theorem that limits them. It's the difficulty of formally writing down our proofs in machine-readable format that limits them for practical purposes.


Mathematical proofs are completely precise, that is to say, not vague in the least. They are, however, in an extremely high-level language; it's left to the reader to expand the notation enough to convince themselves of the validity.

(And, of course, there can be bugs - that is, mistakes - but there is no ambiguity.)


So the underlying ideas are precise, but they're expressed in imprecise language? That's true of all communication.

When I ask my wife for "that thing by the door," I know exactly what I mean, but she has to do a lot of work to decode it.


>I don't understand what you mean by non-standard.

Symbols in advanced math tend to be very poorly defined and contextual. Imagine programming if every function had a two-letter name and the list of which packages to import was left out half the time.

This is covered in the article.

Proofs formatted for automatic solvers show that they can be standardized. Now think about the tiny percentage of proofs that are in this form.


> Proofs formatted for automatic solvers show that they can be standardized. Now think about the tiny percentage of proofs that are in this form.

Just as important, though, is the number of proofs that anyone would want to read in that form. I can't think of anyone who has ever claimed that reading formally verified proofs aids understanding...


Right, the current methods are probably awful for reading. It's a proof of concept more than anything, when it comes to using standard terminology.


> Automatic theorem provers do exist and are employed by mathematicians, but they are inherently limited, as proven by Gödel's undecidability theorems

Human mathematicians are equally limited by Gödel.


> To learn mathematics from scratch. The working programmer simply doesn’t have time for that.

I disagree. Most of high school math and calculus can be learned fairly quickly (like weeks). This is what I have been trying to do with my book "No bullshit guide to math and physics." It has been very successful with coders: http://minireference.com/

Also on the topic math \cap code, here is an excellent talk by Gerald Sussman: http://www.infoq.com/presentations/Expression-of-Ideas


The problem is that high school math and basic calculus is not much mathematics at all, and it still has the problems of foreign notation and ambiguity.


HS math and calc is some math, definitely not all, but a good chunk of what I would call "applied math"---the math that links directly to the real world, meaning the concepts can be understood intuitively, e.g., the connections to physics.

The trick is to covertly throw in some formal proofs to prepare the reader for the more abstract stuff. Going directly into proofs might be too much of a jump for some people, though I agree that, ultimately, this is what mathematics is about.

As for the foreign notation, I wrote an appendix which explains how to read things: http://mcgillweb.alwaysdata.net/notation_appendix.pdf


> HS math and calc is some math, definitely not all, but a good chunk of what I would call "applied math"---the math that links directly to the real world, meaning the concepts can be understood intuitively, e.g., the connections to physics.

IMHO there's little mathematics that can't be applied to the real world rather directly. I even heard mathematicians say that the distinction between pure and applied mathematics is mostly for historical reasons. Especially if you consider connections to physics as intuitive understandability - for example in string theory or quantum field theory you'll find lots of highly complicated mathematics.


> I even heard mathematicians say that the distinction between pure and applied mathematics is mostly for historical reasons.

It must have been an applied mathematician, for I cannot even imagine a pure mathematician uttering anything like this.

I mean, seriously, I know quite a lot of modern pure mathematics, and for most of it I cannot picture even a far-fetched connection with anything existing in the real world, not to mention an actual application in solving some problem that wasn't created only for this application.

Most of the time, for almost every field of mathematics, the way it works is that out of an enormous amount of knowledge, an enormous amount of research happening, and results and papers being published, only a very, very small amount will actually get applied any time soon (soon as in the next 200 years). For some fields, like calculus, or probability, or partial differential equations, the amount of applicable stuff is larger (mostly it's just old stuff anyway, one or two centuries old), and for other fields, like say homological algebra, or algebraic topology, or descriptive set theory, the applicable stuff is almost nonexistent - I'll buy a beer for anyone who'll point me to an application of descriptive set theory to any real-life problem.

Almost all of the mathematics in existence is purely abstract and not applicable to the real world, and I find it hard to even imagine anyone trying to argue otherwise. One can argue that what is now considered abstract and not applicable can become very useful for real-life problems in the future, and indeed, that happens quite often, but I think it will not be much, since even today we're applying only a small fraction of 200-year-old mathematics to real-life problems anyway.


I don't think it's fair to say some idea is not applicable because you don't know of an application. Number theory was like that before the advent of computers, and I'm a firm believer that there are thoughts that cannot be thunk until the right framework comes along to allow it.

That being said, a lot of these pure subjects do have applications. For example: algebraic geometry (the crown jewel of pure mathematics, my colleagues would have me believe) has applications to tons of industrial problems in the form of solving systems of polynomial equations (see homotopy continuation). Algebraic geometry has also been applied to robot motion planning, etc.

Descriptive set theory is applied in functional analysis and in ergodic theory, which in turn is applied to statistical physics. Not to mention that descriptive set theory is the pure-logic equivalent of computational complexity theory, and that there is potential to connect the two fields and resolve some big open problems (though it's doubtful that P vs NP will be resolved this way).

And almost all of modern physics is based on more or less modern mathematics: tensor analysis and other flavors of linear algebra, Lie theory, etc. Algebraic topology is starting to find some traction in the subfield of persistent homology, which aims to study high-dimensional data sets in the context of homological algebra. I even gave a talk earlier this year on the concrete attempts people have made to apply persistent homology to real-world problems [1]. It's still an extremely young field, but shows some promise.

I'll give you that mathematics is extremely abstract, because I believe it. But to say that it's not applicable and not being applied wherever possible is a bit naive. And to say that only 200-year-old mathematics gets applied is to ignore the most applicable fields, which did not exist even a hundred years ago: combinatorial optimization, mathematical computer science, and modern statistics and probability theory.

[1] http://jeremykun.com/2013/04/27/persistent-homology-talk-at-...


I must have not communicated what I meant clearly, because you missed my point.

Only a very, very small fraction of results in number theory are applicable. Only a very, very small fraction of stuff done in algebraic topology is applicable. While descriptive set theory is indeed sometimes applied in functional analysis, the intersection between these two is not a significant fraction of each one of them, and by the time you apply (a very small fraction of) functional analysis to statistical physics, you're already too far from descriptive set theory to even see it on the horizon.

I'm not saying that none of the mathematics is applicable, because this is obviously not the case. What I'm saying is that the stuff that gets applied to real life problems is surprisingly small, even more so when you're not a mathematician.

When I first started to learn mathematics, I was completely overwhelmed when I found out just how much knowledge is out there in this field of human activity. The plains of mathematics are so vast, there's almost nothing else in sight when you stand atop Mount Bourbaki. I realized that even if I get a PhD in pure mathematics, I will still only be able to learn less than 1% of the mathematics ever created in my whole lifetime. That's why when I hear people saying that all math can be applied, I think that they must not have realized just how much of the stuff is in there.

I used to specialize in algebraic topology, and when I first learned about persistent homology, I encountered the home page of a professor at some US faculty who specialized in applying algebraic topology to real life, and the first reaction of me and my classmates, to whom I showed his website, was not appreciation of his results. We were totally amazed that this stuff can be applied to anything _at all_. Yeah, some of these applications were to problems that are usually solved in a better way, some problems were very contrived and seemed to actually be tailored so that one can apply algebraic topology to them, but still these were very fine and interesting results, and we were very surprised by them.

The field of algebraic geometry actually makes an interesting example. Indeed, it is considered by many one of the most, if not the most, abstract fields of mathematics. Initially, at the beginning of the previous century, people were mostly concerned with studying the sets of solutions of systems of polynomial equations. David Hilbert, with his landmark basis theorem and Nullstellensatz, laid the foundations of this field, and because of an essential assumption in the Nullstellensatz theorem that creates a bridge between algebra and geometry, for many decades most of the results concerned only polynomials and sets of solutions over algebraically closed fields, because for anything else the apparatus was just lacking. Algebraic geometry didn't have a reputation as an extremely abstract field, especially since sets of solutions to systems of polynomial equations are quite natural objects, and it's easy enough to imagine their occurrence in real-life problems.

In the 1950s and 1960s, though, the field was completely and utterly revolutionized by Alexander Grothendieck and his school, and that's when it gained its reputation for being very esoteric. Grothendieck's methods and approach allowed algebraic geometers to tackle a vastly bigger range of problems, and ultimately to efface the distinction between algebra and geometry, at the price of making things much more abstract and distanced from more concrete considerations. That's when algebraic geometry expanded its reach to encompass a large amount of research in abstract algebra and number theory.

In the meantime, another interesting thing happened in the field: the advent of computational techniques. Things like Groebner bases really pushed things forward and made a lot of theoretical stuff actually doable in practice. This is mainly what made the things you mention in your post possible: while many industry problems could be formulated in geometric terms earlier, only the rise of computational methods actually allowed us to solve them.

The point here is this: the field of algebraic geometry consists of two parts, the older and more concrete, which can be and is (relatively) frequently applied, and the newer but more abstract, of which very little is applied -- with notable exceptions being, for instance, elliptic curves over finite fields, with their well-known application to cryptography, or Calabi-Yau manifolds, which are intimately connected to string theory. Nevertheless, the applied stuff constitutes only a small part of the field; the rest is just pure and abstract mathematics, without any applications whatsoever. It's still worth noting that algebraic geometry does relatively well in the amount of applicable stuff, and fields like algebraic topology, not to even mention descriptive set theory, fare much, much worse in that regard. To me, what's a bit naive is to think that a big part of mathematics will be applied, when there's so much of it.


I'm aware of the categorical revolution, since I'm young enough to have been (mathematically) raised on that perspective. Some of the more abstract algebraic geometry is actually coming back to computational applications. See, for example: http://www.researchgate.net/publication/226664601_From_Oil_F...

To say that something is not applicable is hard to argue. And besides giving examples of when it is applied (and you claiming it's still an unimaginably small fraction of mathematics) all I can say is that the understanding of some object can provide insights and applications in unexpected ways. Dynamical systems inspire computer graphics, Mobius bands inspire carburetor belt design, category theory inspires Haskell... I just don't think it's fair to ask for the immediate applications of any given theorem because the ultimate application is understanding what's going on.

But thanks for the great discussion! :)


Unfortunately I know little about homological algebra, algebraic topology and descriptive set theory.

But I have certainly seen papers where methods from algebraic topology are applied to pattern detection or to deciding whether some high-dimensional data sets are homeomorphic.

Simplicial complexes (which are used in homological algebra and algebraic topology) can also come from triangulating high-dimensional point sets (for example from big data). I could well imagine that methods from homological algebra or algebraic topology could give us some insight into some properties of this data.


When I picture a Hitchhiker's Guide to Mathematics, I don't picture a book that actually teaches you mathematics. I picture a book that will, on demand, give you just enough information to get by in a particular area of mathematics without blowing off a leg or something. You know, like the Hitchhiker's Guide to the Galaxy didn't actually make you an expert on a planet, just gave you some tips for having fun and just barely surviving should you find yourself stuck there.

So yeah, a big book of formulas, algorithms and mathematical structures with example applications complete with code snippets and exhaustive indexing. You wouldn't learn anything worthwhile (not Real Math and not really even applied mathematics) but it might get you out of a jam now and then.

It would take heroic effort and probably sell in the hundreds at best, and it would be an immense challenge to keep the examples general enough while still being useful, but there's no a priori reason why it couldn't be done.


This is a terrific idea -- a math book that avoids the pitfalls of most math books aimed at nonmathematicians.

Such a book could cover many important mathematical ideas without necessarily lapsing into equations and overly technical explanations. For example, it should be possible to describe how compound interest works without falling into an obscure technical explanation, and understanding compound interest is very important in modern life.
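
For instance, a two-line sketch with made-up numbers already shows the essential point, that the balance grows exponentially rather than linearly:

    -- Balance after n years at a fixed annual rate, compounded yearly.
    compound :: Double -> Double -> Int -> Double
    compound principal rate years = principal * (1 + rate) ^ years

    -- compound 1000 0.05 10 is about 1628.89, not 1500: interest earns interest.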

Another example might explain why the stopping distance of a car is proportional to the square of the speed -- this is not well-known, and it's important for drivers to know, young ones especially.
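
(The underlying arithmetic is tiny: under a roughly constant braking deceleration a, the stopping distance from speed v is v^2 / (2a), so doubling the speed quadruples the distance. A throwaway sketch with made-up numbers:)

    -- Stopping distance (metres) from speed v (m/s) under deceleration a (m/s^2).
    stoppingDistance :: Double -> Double -> Double
    stoppingDistance a v = v * v / (2 * a)

    -- stoppingDistance 7 (50 / 3.6)  is about 13.8 m at 50 km/h;
    -- stoppingDistance 7 (100 / 3.6) is about 55.1 m at 100 km/h, four times as far.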

Yet another example would explain why each member of the running sum of odd numbers is a perfect square. Expressed in words, it's not obvious that it's true or why it's true, but a picture conveys the reason immediately and intuitively: http://arachnoid.com/example/index.html#Math_Example
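
You can also just watch it happen (a one-liner, not a proof):

    -- Running sums of 1, 3, 5, 7, ... are exactly the squares 1, 4, 9, 16, ...
    sumsOfOdds :: [Integer]
    sumsOfOdds = scanl1 (+) [1, 3 ..]

    -- take 6 sumsOfOdds == [1, 4, 9, 16, 25, 36]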

Just a few examples. I'm sure one could fill such a book with examples that would convey useful information, make math sound like fun, or both, without being preachy or too technical.


> For example, it should be possible to describe how compound interest works without falling into an obscure technical explanation

I personally find these obscure technical (and highly abstract) explanations often far easier to understand. I often found "easy" explanations highly illogical - it was not until I got the rather abstract explanations that I found them acceptable (and even this was not always the case - almost always the explanation for this phenomenon was that the definitions given in foundation courses could be abstracted a lot).

How can this be explained? The reason is simple: in highly abstract definitions anything that is not necessary is omitted - so there is less to think about. Additionally, in this kind of definition there is a lot more "internal logic". What does this mean? This is a little bit difficult to explain for non-mathematicians, but you can be sure that anything in the definition has a deep meaning. If something in the definition seems strange to you, you can be sure that whatever remains to be understood carries a deep meaning. On the other hand, when using "simple" definitions, you always have to worry whether, if something sounds strange, it is because you haven't understood it or because the "simple" explanation was simply bad.

Disclaimer: I'm a mathematician (as may be imagined). But I'm a computer scientist, too. :-)


> I personally find these obscure technical (and highly abstract) explanations often far more easy to understand.

I do, too, but I also know that nontechnical, nonmathematical people are turned off by a quick immersion in mathematical reasoning. I have a theory (not just mine by any means) that if the beauty of mathematics could be presented before the required discipline and attention to detail, we might lose fewer possible future mathematicians. As things stand, the public level of innumeracy is depressing.

> On the other hand: when using "simple" definitions, you always have to worry whether, if something sounds strange, it is because you haven't understood it or if the "simple" explanation was simply bad.

Yes, very true, one must be very careful to get it right while making it simple. I personally think a persuasive layman's explanation of something mathematical can go wrong in so many ways, and the more persuasive, the more room for error. Consider all the crazy "explanations" of quantum theory out there -- the more popular ones have no connection to reality.


I was actually thinking something just a little more technical aimed at coders only. But maybe that's the real reason it doesn't exist: no one can agree on what it is. :)


Bronstein/Semendjajew’s Handbook of Mathematics[0] is something close to what you envision. It’s helpful if you just need to look up a given formula and understand the larger picture, but it will be incomprehensible if you don’t know at least some maths. Note that the HHGTG, while giving you some information about planets, will be useless unless you know something about hitchhiking, too: like the Bronstein, it gives you bits and pieces to remember parts of the larger picture, but it doesn’t give the larger picture to you.

[0] http://www.amazon.com/Handbook-Mathematics-I-N-Bronshtein/dp... (apparently without an English Wikipedia article)


The syntax blocker is the biggest one for me. Are there books full of classic proofs and explanations of their syntax and shorthand? I'd be very interested.


Usually, each maths textbook defines the syntax used within at the start and then ‘rewrites’ all proofs in that particular syntax.

Since syntax – as opposed to the mathematical facts expressed thereby – is usually considered something not worth learning by heart, I doubt that you will find a book that collects proofs in their syntax of original publication and then explains that syntax.


I wrote a little "math syntax" to "english words" dictionary which might help you: http://mcgillweb.alwaysdata.net/notation_appendix.pdf

In particular, the section on set theory is important (element of, subset, etc.), as are the two quantifiers: for all, and there exists. Here is a 6-page primer on set notation, which contains a simple proof: http://mcgillweb.alwaysdata.net/set_notation.pdf


Yes!

Grab a copy of "Proofs from the Book". It's tremendous.


Proofs from the Book is awesome. Another book in a similar vein, but requiring less of a mathematical background and targeted more at the curious layman is "Journey through Genius". Journey through Genius also spends more time giving general background about the players involved and telling the story leading up to the proof.


When you tackle a new problem the hardest part is often just figuring out what you don't know, but need in order to solve the problem. You might find a paper or book that solves a very similar problem (if you're lucky) but find you just can't understand what you're reading. This is probably because the author assumed his audience would know things you've never even been exposed to. It would be rather hard to write much of anything if authors didn't do this! However, it makes life rather difficult for "foreigners" to the discipline who don't have a good idea of the discipline's city layout and what neighborhoods they need to hang out in to find help.

The ideal solution is to get a local guide to help you, but a map of knowledge covering as much of the city as possible would be almost as helpful if it were any good. Unfortunately, mapping cities of knowledge is harder than mapping real cities!

It would be utterly fantastic if math (or physics, economics, etc.) books and publications came with associated meta-data that would tell you what dependencies are associated with what you're reading. Ideally, it should be possible to trace the map from string theory right back to counting. This map would be your guide to all the rabbit holes you dive into! It wouldn't perform magic and explain quantum physics to you in a paragraph, but it would give you an idea of how much you don't know and where you need to start.


I have to disagree with the basic assumption of the article.

It's often not necessary for a coder to understand why some piece of math works, only how to use its results. E.g.:

I've coded the FFT in 5 different languages and used it countless times. Still, it took me ages to understand why the DFT works, and I have not yet taken the step of understanding how the DFT leads to the FFT.

I know dozens of Second Life scripters (including me) who use quaternions regularly, yet nobody could tell me why they work so well for 3D rotations.

I coded my first Markov chain 35 years ago, at the age of 12, and Markov was never mentioned in school at all.

I'm using dozens of machine learning methods on a daily basis, and have coded a few myself, but I do not care why they work, only what their strong points and limitations are.

It's not necessary to know how to design and build a car if you just need to drive from A to B. It's not necessary to study biochemistry to be a good cook.


The article does not argue that anyone needs to do mathematics. It just attempts to explain why mathematics tends to be hard for otherwise smart and motivated programmers.

That being said, the FFT is a divide-and-conquer approach to multiplying the DFT matrix by a vector, and it achieves n log n time by taking advantage of the special structure of the DFT matrix.
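
For the curious, the radix-2 case fits in a few lines of Haskell. This is only a sketch (it assumes the input length is a power of two and does nothing clever about allocation), but it shows the structure being exploited: the even- and odd-indexed samples each need only a half-size transform, glued back together with twiddle factors.

    import Data.Complex

    -- Radix-2 Cooley-Tukey FFT; the input length is assumed to be a power of two.
    fft :: [Complex Double] -> [Complex Double]
    fft []  = []
    fft [x] = [x]
    fft xs  = zipWith (+) evens twiddled ++ zipWith (-) evens twiddled
      where
        n        = length xs
        evens    = fft [x | (i, x) <- zip [0 ..] xs, even i]
        odds     = fft [x | (i, x) <- zip [0 ..] xs, odd  i]
        -- half-size transform of the odd samples, rotated by the twiddle factors
        twiddled = [ exp (0 :+ (-2 * pi * fromIntegral k / fromIntegral n)) * o
                   | (k, o) <- zip [0 ..] odds ]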


So could anyone suggest a nice book for a high school student (I know basic algebra, but no calculus so far) to learn math? The school system is not very good here, so I have to do some of the learning by myself.

I have looked at Concrete Mathematics by Donald Knuth, would that be complicated to understand?


Velleman's "How to Prove It" is a great book to learn how to do mathematics.

http://www.amazon.com/How-Prove-It-Structured-Approach/dp/05...


Check out my No bullshit guide to math and physics: http://minireference.com/ [$33], it covers all of high school math, mechanics, derivatives, and integrals.

An excellent free alternative is Calculus Made Easy by Silvanus P. Thompson, which is very good and also funny http://www.gutenberg.org/ebooks/33283


That book looks amazing, I'd love to learn some Physics too, will see if I can get it.

I read the first pages of Calculus Made Easy; it looked nice so far.


Concrete Mathematics should not be too hard to understand for someone who understands basic algebra. Most of the topics it covers use nothing more than arithmetic and simple logic.

That said, it is not a good general mathematics book (it is designed as a computer science book).

A book that will help you - How to think like a Mathematician: http://www.amazon.ca/How-Think-Like-Mathematician-Undergradu...


I'd suggest that you also spend some time learning how to learn maths more effectively. That usually pays dividends over time and ought to give you a significant advantage.

Two good books on this:

How to Solve It

http://www.amazon.com/How-Solve-It-Mathematical-Princeton/dp...

Thinking Mathematically

http://www.amazon.com/Thinking-Mathematically-J-Mason/dp/020...

And if you want to get better at sitting maths-based exams, here's my own book, Exam Mastery: How to excel in maths-heavy exams :

http://www.amazon.com/Exam-Mastery-excel-maths-heavy-ebook/d...

(No need to buy though - if you want it, shoot me an email and I'll just send it over to you.)


I recommend Paul Lockhart's Measurement.


If it weren't for mathematics I would've graduated this year; instead I'm still stuck with Algebra and Calculus and whatnot. I don't know about you, but every time I read a theorem it's like getting kicked in the nuts and then punched in the stomach. The fact is, our brains are wired to excel at a handful of things, and if math is not one of them you're out of luck. You can't teach a person to paint, or to write, because you can't teach them talent; the same is true with math. You just have to plough through and hope to get better with time.


This is completely false:

http://www.amusingplanet.com/2011/09/jonathan-hardesty-9-yea... (the original forum thread seems to be down at the moment) - but it chronicles a guy who started to learn to draw and over the course of 9 years you can see the improvement.

I started playing guitar when I was 11...did I suck then...yes...I could barely hold a full-sized guitar...and over the years I have learned new techniques and practiced endlessly...and now people would say I am pretty good.

Writing and math are no different - to peg them as just "talent you are born with" discounts all the work people have done on them throughout their lives.


That only proves my point, from the video you can clearly see he was gifted with an artistic aptitude that he just honed over the years. I wouldn't be able to draw anything like the first drawings of that guy, not even after 10 years of practice. Practice can get you a long way, but if the seed isn't there, you just won't get that far.


Are you serious?!

The first couple of drawings are terrible, primary school level drawings - the perspective, shading etc. are all completely off.

If you honestly believe you couldn't pull something like that off given 30 minutes of dedicated drawing...well I don't believe you.

Some more examples of skills that are learned:

* Programming

* Public Speaking

* Estimation

* Throwing

* Juggling

Or are you going to start telling me that some babies are born with an innate ability to juggle? or programming?

I am not saying these skills are easy or trivial, they require thousands of hours of dedicated practice. It sounds to me you think most people are simply good at something and do that...when in fact people spend thousands of hours honing and perfecting skills.

You can't jump into the water and start swimming until you learn how to control your body through crawling, standing and lifting...with maths...you may be unable to grasp theorems because you lack basic training in propositional logic (something that is rarely taught before undergraduate level, unfortunately - it is so simple and provides a grounding that would make any additional mathematics teaching a whole lot easier) - what is your mathematical background? What books have you studied? What lectures/courses have you taken? Let's see if we can't fix this!


Obviously a modicum of training is always needed in most contexts; the point here is that if you have an innate ability to, say, swim (which I don't, by the way), you can pick it up after a few hours in the water (I've seen toddlers do it), and that's only because you have a web of neural pathways that makes you particularly apt at this specific task. Lacking that wiring, you're left flailing in the water wondering why you had such a stupid idea as to get in the water in the first place.


Swimming is not innate...some people do pick it up easily, others take a couple of tries, but I have never heard of anybody who is "unable to learn how to swim", and once they learn they can practice and get very good at it....there is sometimes a genetic advantage (Michael Phelps springs to mind) - but he wasn't always a nearly 7-foot swimming giant; he had to learn different techniques and practice for hours and hours a day to get as good as he is.


I said it multiple times: practice will let you improve at anything, but if your genetics are playing against you, you'll face a very steep road.


Actually you started off by saying you can't teach "talent", now you are saying you can teach it but it may be hard...

Do you honestly think your genetics are playing against you in your study of mathematics? Which bits are you struggling with? What have you tried to fix it? What courses have you taken? Have you tried private tutoring? Khan Academy? Have you tried anything for longer than a couple of days? Weeks? How many hours of deliberate practice do you think you have put into studying mathematics?


You might not be able to draw the first drawing right now, but if you are as bad at drawing as the average person, I can't imagine it taking more than a few days or weeks (max) to get to that level, with a very modest time commitment. 10 years of practice dedicated to self-improvement? You'd be extremely technically proficient by then.


It certainly took you many years to learn how to read and write, yet here you are.

It sounds like you have a grudge against basic mathematics because you think it isn't important, you're afraid of it, and it affected your life in an adverse way. Unfortunately that has little to do with a human's ability to learn.

The truth is that talent means nothing in comparison with practice.


> It certainly took you many years to learn how to read and write, yet here you are.

Exactly. Most people don't realize how unnatural and difficult written language actually is, yet here we are. There's a great book on the subject called 'Proust and the Squid'[1]; it looks into the history, development and neuroscience of reading and it's quite eye-opening. There's really no reason to believe why math/music/programming/etc should be any different.

[1] http://www.amazon.com/Proust-Squid-Story-Science-Reading/dp/...


If reading and writing were once unnatural and difficult skills but are now easily picked up, it's just because evolution has wired our brains to learn them faster; maybe future generations will struggle less with math for the same reason.


I don't think you realize how much time you spent learning to read and write, considering that every waking moment you're surrounded by things with text on them.

Also, you have a pretty skewed idea of how evolution works. How long do you think it took for evolution to "wire our brains" to learn to write quickly?


The fact that we're continually surrounded by written text only helps explain why we don't have that much difficulty learning how to read it.

As for the time it took for evolution to rewire our brains, I'd say more than a thousand years; it's not like writing was invented yesterday.


What proportion of the world was literate a thousand years ago?


"Now", as in an immensely tiny blink in the human evolutionary time-scale? Sorry, that's more wishful thinking than the notion of skill acquisition. Surely you're aware of feral children, and more 'normal' dyslexics; what's the deal with those? Clearly we should be reading straight out of the womb by now, so what gives?


Feral children by definition aren't exposed to anything resembling writing and dyslexia only proves that neurological differences play a major role in our ability to comprehend and parse information.


Exactly, but dyslexics can still read, because they can learn how to. Their genetics may not have given them a good start, but evolution certainly didn't make their lives any easier either; they still had to learn things 'the hard way' like the rest of us -- just with a bit more resistance.

Difficulty and ability are orthogonal matters, but you seem to be trying to defend a hard/static correlation between them. Of course difficulty affects ability, but it doesn't limit/promote anything. As long as you keep exposing yourself to something, your brain will adapt to get more used to it (i.e. 'get better' at it). That doesn't indicate how fast you'll be at it, nor does it mean you can't start at a lower/higher point than other people (which in itself doesn't dictate whether you end up superior/inferior to them) -- it just means it's possible.


I'm not saying that if you start out without being able to solve basic equations and then spend 10 years learning math you still won't be able to solve those basic equations, you will, but the effort to get there will be a power of that needed by a "math-inclined" person. And once there the math-inclined will still run circles around you.


> ...but the effort to get there will be a power of that needed by a "math-inclined" person.

'math-inclined'... so in other words, interested/experienced in math?

Um, yeah?

You don't like math, we get it. But you can't justify matters of taste 'scientifically' like that. There is no other way we can quantify these 'inclinations' you speak of other than taste, so there's nothing really new here.

Not everybody born with perfect pitch wants to be a musician, and I can guarantee you there are plenty of musicians without it that can run circles around some of those that do have it.

I'm interested in programming and music, and confidently feel I'm good at both, does that mean I'm "musically/programmatically-inclined"? If so, how might I be? Or instead, how would you define a person that isn't musically-inclined? You could argue that a literally tone-deaf person isn't, but not that !musical_inclination == tone_deafness; so how would you define it? The simplest definition that would catch the most cases here would just be the lack of interest. So in other words:

    if (interests.contains(some_subject)) {
        return POTENTIAL_SUCCESS;
    }
    else {
        return null;  // because it is 'undefined', not necessarily FAILURE
    }
Pretty novel idea there. Who would've thought you needed to be interested in something to excel in it?


> I'm not saying that if you start out without being able to solve basic equations

Well, I'd argue this is a bad place to start out. The better questions are:

    1. What do these equations represent?
    2. Why do we care about what they represent?
    3. What does "solving them" mean for the things they represent?
So, maybe a more important muscle to exercise is the one that lets us go up and down the ladder of abstraction. A huge part of mathematics is seeing a class of (relatively) concrete things, finding commonalities and drawing out abstractions, reasoning about the abstractions, and then drawing conclusions about the more concrete things.

You can do so much of this without the formalisms of classroom mathematics and, honestly, it's what "real" mathematicians do. The formalisms are just a high-bandwidth way of communicating these ideas between mathematicians. They're just a highly technical map of the terrain, not the terrain itself.

In fact, if I was teaching someone and they invented their own bizarro syntax for "solving" equations I'd be ecstatic. Whatever it was that happened in their head that made them do that is the real mathematical work.

> but the effort to get there will be a power of that needed by a "math-inclined" person.

This is fine and true, perhaps, but if so it's true of all things, not just mathematics. You can replace "math" with "writing" or "running" or "playing a kazoo."

What do you think makes mathematics special?



Reading and writing are basic skills that could never be compared to anything even remotely resembling a subject like mathematics. And I don't think math isn't important; on the contrary, I think it's the most important discipline of all, a pillar of modern society. But I also acknowledge that it's the preserve of a select few (like chess, for instance); read my other responses for a more detailed view of my point.


It doesn't take a grandmaster at chess to play the game. Just as well, there are varying levels of understanding and producing mathematics.

You think of language as a basic skill because you have taken it for granted your whole life, but the same kinds of mental abilities used for language are what is needed for mathematics. There are heaps of scientific evidence supporting this. [1]

Again, you are putting mathematics on a pedestal because it's foreign to you, not because of any inherent difficulty in the subject itself. You're "acknowledging" a myth as fact.

[1] http://www.amazon.com/The-Math-Gene-Mathematical-Thinking/dp...


"You can't teach a person to paint, or to write, because you can't teach them talent, the same is true with math."

Never a falser word spoken.


So you can teach talent.


No, but talent does matter a lot in the last mile. Barring any physical or mental handicaps, however, I'd say it matters very little in the first few miles. I'd even go so far as to say this is true of almost any thing one might choose to get good at, not just mathematics.

To put on my logician's hat for a second, the great grandparent said, "You can't teach a person to paint, or to write, because you can't teach them talent, the same is true with math." The unstated assumption in this sentence is that one must be talented at, say, writing in order to learn how to write. Of course, this depends entirely on what one means by "writing", "talented," and "learning."

If "being able to write" means being able to write as well as the best writers do then, yes, I'd agree that takes talent. If "being able to write" means having a command of written language above the mere mechanical act of putting words on a page, e.g., humor, irony, structure, flow, etc., then no, I don't think that requires talent. It requires interest and deliberate practice, but I don't think it requires talent.

This word "talent" is kind of a boogey man, anyhow. On the one hand folks plenty of folks say I'm "talented" at math. On the other hand, there were years in college where I spent 40+ hours a week studying it.

I think anyone who chooses could become decent at mathematics, swimming, playing the guitar, or baking pies if they really wanted to, regardless of talent.


Your point is valid, but there's a difference between writing and learning math: while you can without a doubt learn the basics of putting words on a page, you can't just learn enough math to write/understand a proof. A proof is reasoning put into writing; there's no mathematical equivalent of "putting words on a page" that lets you come up with or comprehend one.


Well, perhaps we disagree, but I don't think mathematics is about "proofs". I think proofs are a by-product of folks doing mathematics. They're the output of a long chain of mathematical thinking and can take many forms. I would agree that being able to follow contemporary mathematical writing is important, but one should learn the formalisms after they have a grounding in mathematical thinking.

The formalisms aren't the point. For example, I wouldn't stress out if a student couldn't follow Newton's *Principia Mathematica*. I've tried and I can't, really.

This is why the way we introduce "proofs" in high school in the US (via tedious and seemingly arbitrary geometric arguments) is so destructive. It's all formalism and, honestly, very little mathematics.

I can and have explained the Pythagorean theorem to folks with virtually no mathematical training. The only prerequisite is the ability to find the area of a rectangle and perhaps a little bit of high school algebra.
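
(For the record, one such area argument, which may or may not be the one meant above: place four copies of a right triangle with legs a, b and hypotenuse c inside a square of side a + b, leaving a tilted square of side c in the middle. Then)

    (a + b)^2 = c^2 + 4 * (1/2)ab   =>   a^2 + 2ab + b^2 = c^2 + 2ab   =>   a^2 + b^2 = c^2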

We're not talking about diagram chasing, here.


Yes:

http://www.amazon.com/Talent-Code-Greatness-Born-Grown/dp/05...

Success/fame/fortune/happiness/etc are another matter, but skill acquisition has been shown to be generally straightforward. Some things may have higher learning curves for some people (e.g. math for a person with dyscalculia[1]), but that doesn't mean it's impossible.

[1] http://en.wikipedia.org/wiki/Dyscalculia


Now I don't wanna be that guy, and I hate to break it to you, but the author isn't a behavioral psychologist or a social scientist. He graduated from journalism school, he's in the business of writing books, and his credentials for tackling such an argument are squat.

Hoping that practice alone in any given subject will get you wherever you want is just wishful thinking. Think of all the kids who play piano, or write poems, or generally do something great at a very young age. You can't ascribe their skills to practice or experience; given their age, you just have to acknowledge that it's talent, an innate ability. They don't learn how to play piano, they just get it. The same is true with math people: their brains are wired to comprehend a complex subject, arguably impenetrable for most of the population.


This argument would be great if you actually had any substance to back up your claims; without it you're not really arguing against anything. Perhaps I linked to a book written by a journalist because it's more accessible, and thus more helpful/actionable than published scientific papers? If math is difficult to grasp for some people, why would journal articles be any better? Not to mention that maybe the book itself has professional/scientific citations in case you were really interested in learning more?

Neuroscience supports the ability to acquire skills. The combination of epigenetics, GMOs, neuroplasticity, and many other advances make it even more apparent that adjusting ourselves is possible; even without a 'free will'[1].

If people don't feel like trying to learn/improve, that's fine; but they shouldn't try going around loosely justifying their actions with 'science' just because it makes themselves feel better.

[1] I only bring this up because determinism seems to be the biggest argument against this for some reason, but it really doesn't counter anything. Hell, I actually agree with it, and have written on the matter: https://news.ycombinator.com/item?id=6235628


This seems like a reasonable place to mention that years ago a guy called Tom Henderson in Portland put up a Kickstarter project for a Punk Mathematics book, which I funded, but he has taken $30,000, completely disrespected his funders and failed to produce anything at all. If you meet the guy, do (verbally) thump him for me. http://www.kickstarter.com/projects/1541803748/punk-mathemat...


> My programs absolutely reeked of programming no-nos. Hundred-line functions and even thousand-line classes, magic numbers, unreachable blocks of code, ridiculous code comments, a complete disregard for sensible object orientation, negligence of nearly all logic, and type-coercion that would make your skin crawl.

It probably would have been easier to start on a language that's procedural and maybe dynamically-typed. There are fewer traps to fall into.


Great read. Khan Academy does solve a lot of the headaches that come with math - thank you, Salman Khan! If it weren't for him I wouldn't have gotten a 3.8 in Calculus.

For those who don't know math or want to learn, this will make your life easier: https://www.khanacademy.org/


I am not a big fan of the 'black screen with voiceover' format. There are much, much better alternatives for calculus these days; for example, the Calc 1 course on Coursera is excellent.


On the contrary, I prefer his method of teaching more than anything else; it can be quite distracting when there are faces or hand movements on the screen.


Excellent read. Thanks for this.


> It is an interesting pedagogical question in my mind whether there is a way to introduce proofs and the language of mature mathematics in a way that stays within a stone’s throw of computer programs. It seems like a worthwhile effort, but I can’t think of anyone who has sought to replace a classical mathematics education entirely with one based on computation.

...Curry–Howard correspondence?


Is quite a large stone's throw away from both computer science and standard mathematical proofs.


> I was hit hard in the face by a segmentation fault. It took hundreds of test cases and more than twenty hours of confusion before I found the error: I was passing a reference when I should have been passing a pointer.

That code wouldn't compile; if a pointer is expected you can't just pass a reference - hello, static typing. Furthermore, you can't explicitly "pass a reference" to begin with, as in there is no syntax in the caller to signify "this shall be passed as a reference"; that would be something in the signature of the CALLEE. So his "error" literally makes no sense, as it cannot happen. If what he means is that he wrote the callee wrong, that doesn't make any sense either, because the whole point is that you can't reseat (make it point to something else) a reference. So you would know you needed to use a pointer instead the minute you tried to reseat a ref type.

Also, anyone who takes 24 hours to debug a segfault needs to pick another profession and quickly.


If you have never taken 24 hours to debug a segfault, you haven't done enough C/C++ programming.

E.g. a rare race condition causes some memory to be freed while there's still a live pointer to it, but THAT doesn't cause a segfault immediately because the memory is immediately reused for some other data structure (in a different thread), so when the dangling pointer is accessed it points to allocated memory that's being used for some other purpose. Some field in the data structure the dangling pointer references is mutated, and then finally in another thread that same memory is accessed via a different pointer (the one that was recently allocated) and part of it is treated as a pointer, and dereferencing THAT causes a segfault.

(Oh and by the way, in real life things can get much more complicated than this before the program blows up with a segfault. In fact, you're lucky if there's a segfault, instead of finding out months down the road that your data has been subtly wrong the whole time.)

Ultimately the solution to this is some iterative combination of swapping in a guarded malloc, then trying things in Valgrind, then trying various Valgrind options, then setting some hardware watchpoints in gdb, then finding that the race condition disappears in the debugger, then adding log statements all over the place, then statically analyzing the code for hours while pulling your hair out, then finally by accident noticing that padding the first data structure with an extra field makes the problem go away, then setting the right hardware watchpoint that actually helps, etc, etc, then piecing together a real story for what's happening, and ultimately making a 2-line fix.

If you think you're somehow "above" this kind of horrible drudgery, and that your immense programming skill will let you avoid it forever, then you are just not very experienced (or you're not working on hard problems).


These things are true. Things do quickly get complicated when you're trying to replicate race conditions. What you described, though, is not trying to "fix a segfault"; it's trying to "fix a race condition" where the fault is just a symptom. In the example in the OP there was none of that. Even assuming what he meant was that he accidentally mutated a pointer he passed by reference instead of by value, which caused problems after he returned and things fell out of scope or were freed, you could debug that in 5 minutes just by setting a watch breakpoint on the pointer to see where it was last changed before the errant access. An hour, tops, if that didn't occur to you to begin with.

Tracking down multi-threaded memory corruption is tough, but those bugs are going to be rare. Threading is a tough problem to begin with, but it's certainly not a fault of C/C++. And whatever this guy was describing should not have taken 'hundreds of test cases and over 24 hours'.


Why do you say "should not have taken?" Given the fact that he was a student at the time, learning C++, I think it's perfectly understandable that he ran into a "bang your head against the wall" type problem. Maybe it was his first real segfault debugging session. I doubt you solved your first real segfault in 5 minutes.

Sure, I'd expect someone with a couple years' experience to be able to debug a simple segfault relatively quickly. But the only reason they'd be able to debug it quickly is because they've _painstakingly done it before_, and can draw from that experience.


This is probably one of the least-important parts of the article. It was a generic example of a situation where you spend a long time trying to solve an error based on something relatively trivial. There is no reason to call him out on this portion of the post as it doesn't pertain to the point of the article, and just makes you look sort of dickish.



