# Math is confusing even for those who are good at it

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com

Source

Many people who are in this position, trying to learn mathematics on their own, have roughly two approaches. The first is to learn only the things that you need for the applications you’re interested in. There’s nothing wrong with it, but it’s akin to learning just enough vocabulary to fill out your tax forms. It’s often too specialized to give you a good understanding of how to apply the key ideas elsewhere. A common example is learning very specific linear algebra facts in order to understand a particular machine learning algorithm. It’s a commendable effort and undoubtedly useful, but in my experience this route makes you start from scratch in every new application.

The second approach is to try to understand everything so thoroughly that it becomes a part of you. In technical terms, such people try to grok mathematics. For example, I often hear of people going through some foundational (and truly good) mathematics textbook, forcing themselves to solve every exercise and prove every claim “left for the reader” before moving on.

This is again commendable, but it often results in insurmountable frustrations and quitting before the best part of the subject. And for all one’s desire to grok mathematics, mathematicians don’t work like this! The truth is that mathematicians are chronically lost and confused. It’s our natural state of being, and I mean that in a good way.

Andrew Wiles, one of the world’s most renowned mathematicians, wonderfully describes research as exploring a big mansion.

> You enter the first room of the mansion and it’s completely dark. You stumble around bumping into the furniture but gradually you learn where each piece of furniture is. Finally, after six months or so, you find the light switch, you turn it on, and suddenly it’s all illuminated. You can see exactly where you were. Then you move into the next room and spend another six months in the dark. So each of these breakthroughs, while sometimes they’re momentary, sometimes over a period of a day or two, they are the culmination of, and couldn’t exist without, the many months of stumbling around in the dark that precede them.

Wiles is describing what it’s like to do mathematical research here, but I’d argue that the same analogy largely holds for learning mathematics.

December 12, 2018 7:50 pm
