It certainly is more painful, but it is more beneficial. It is also harder to teach, but I stand by my claim.
I'll quote Poincaré:
> Math is not about the study of numbers, but the relationships between them.
The difficulty, and the benefit, of rigor is the abstraction. Math is all about abstraction.
The abstraction makes it harder to understand how to apply these rules, but if one breaks through this barrier one is able to apply the rules far more broadly.
----
Let's take the Fundamental Theorem of Calculus as an example[0]:
f'(x) = lim_{h->0} {f(x + h) - f(x)} / {h}
Take a moment here and think about its form. Are there equivalent ones? What does each of these symbols mean?
If you actually study this, you may realize that there are an infinite number of equations that allow us to describe a secant line. So why this one? Is there something special? (hint: yes)
Let's call that the "forward derivative". Do you notice, through the secant line explanation, that the "backward derivative" also works? That is
f'(x) = lim_{h->0} {f(x) - f(x - h)} / {h}
You may find the symmetric derivative too!
f'(x) = lim_{h->0} {f(x + h) - f(x - h)} / {2h}
In fact, you see these in computational programs all the time! The symmetric derivative even has the added advantage of its error converging at an O(h^2) rate instead of O(h)! Yet, are these the same? (hint: no)
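To make that convergence claim concrete, here's a minimal Python sketch (my own illustration, not from the original comment; the helper names and the choice of f = sin are assumptions) comparing the three difference quotients, plus a check on |x| showing the definitions are not interchangeable:

    # Compare forward, backward, and symmetric difference quotients.
    import math

    def forward_diff(f, x, h):
        return (f(x + h) - f(x)) / h

    def backward_diff(f, x, h):
        return (f(x) - f(x - h)) / h

    def symmetric_diff(f, x, h):
        return (f(x + h) - f(x - h)) / (2 * h)

    f, exact = math.sin, math.cos(1.0)  # exact derivative known for comparison
    for h in (1e-1, 1e-2, 1e-3):
        print(f"h={h:g}  forward err={abs(forward_diff(f, 1.0, h) - exact):.1e}  "
              f"symmetric err={abs(symmetric_diff(f, 1.0, h) - exact):.1e}")
    # forward/backward error shrinks like O(h); symmetric error shrinks like O(h^2)

    # They are not the same as definitions: |x| is not differentiable at 0,
    # yet its symmetric quotient there is 0 for every h.
    print(symmetric_diff(abs, 0.0, 1e-3))  # 0.0
    print(forward_diff(abs, 0.0, 1e-3))    # 1.0
    print(backward_diff(abs, 0.0, 1e-3))   # -1.0

Halving h roughly halves the forward/backward error but cuts the symmetric error by about a factor of four, which is the O(h) versus O(h^2) behavior above; and the |x| example gives a point where the symmetric quotient converges even though the derivative does not exist.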
I'm betting that most classes that went through deriving the derivative did not answer these questions for you (or you don't remember). Yet had they, you would have instantly known how to do numerical differentiation and understood its limits and pitfalls, and subjects like FEM (Finite Element Methods) or Computational Methods would be much easier for those who take them.
----
Yet, I still will say that this is much harder to teach. Math is about abstraction, and abstraction is simply not that easy. But abstraction is incredibly powerful, as I hope every programmer can intuitively understand. After all, all we do is deal with abstractions. One can definitely be overly abstract, which makes a program uninterpretable for most, but one can also give a program too little abstraction, in which case we end up writing a million variations of the same thing, taking far more lines to write/read, and making the program too complex. There is a balance, but I'd argue that if one is able to understand abstraction, it is far easier to reduce abstraction than it is to abstract.
This is just a tiny taste of what rigor holds. You are absolutely right to be frustrated and annoyed, but I hope you understand your conclusion is wrong. Unless you're Ramanujan, every mathematician has spent hours banging their head against a literal or metaphorical wall (or both!). The frustration and pain are quite real! But it is absolutely worth it.
You are arguing for rigor, not for its didactics. Those are different.
> had you, you would have instantly known how to do numerical differentiation and understand the limits, pitfalls, and other subjects like FEM
No, you wouldn't. You would also learn things out of order. You would be exposed to things without understanding why you are learning them. People who argue this usually learn things the intuitive way (whether from rigorous material or not - what goes on in their mind isn't rigorous), and then they go back and reassess the rigor in the light of that. Then they pretend that they learned from the rigorous exposition. No, they didn't.
It is totally fine to iterate. Learn non-rigorously. Go back to it and iterate on rigor later. As it becomes necessary, and if it ever becomes necessary for your field.
> Unless you're Ramanujan, every mathematician has spent hours banging their head against a literal or metaphorical wall
Particularly if you are learning from "rigorous" material. But then you go watch some YouTube videos to make up for the absence of didactics in your textbook.
I mean, why don't we just throw Bourbaki books at freshmen and let them sort it out without classes? They are maximally rigorous, therefore maximally great to learn from, right?
> I mean, why don't we just throw Bourbaki books at freshmen and let them sort it out without classes?
The Bourbaki group was quite famous for wanting to restructure math education, teaching children many things that are considered advanced. Despite the movement not sticking around, we still see elements of it resurface, and with some effectiveness.
So I'm not sure your argument of "out of order" is accurate. The order is what we make it. There's no clear optimal way to teach math, and your argument hinges on there being one. You might argue that the current status quo is working, so why disrupt it, and I'll point around and ask if you really think it's so effective when so many people demonstrate a lack of understanding all around us. That so many struggle with calculus is evidence itself. We need not even acknowledge that there are many children who learn this (and let's certainly not admit that it's far more common for them to learn it in unconventional ways).
> let them sort it out without classes?
To suggest I'm arguing for the elimination of educators is beyond silly. I'd hope the caliber of your arguments would match that of your diction.
The Bourbaki group was responsible for the Mathématiques modernes movement, which intended to reform basic education around rigorous math. It may be the best-documented educational failure in history. It was tried in several countries, so the sample size isn't one. All evidence points at rigor as a terrible educational tool.
The point of Math education isn't to create a clone army of the most pedantic pure Mathematicians in Academia.
> if you really think it's so effective when many demonstrate a lack of understanding all around us. That so many struggle with calculus is evidence itself
New Maths is often called a failure, but there's no real clear evidence that's the case. This is especially true in America, where criticism ran high as parents were quite vocal and struggled to help their kids (remind you of a different movement?).
There is a certain irony with that too. The major criticisms were about the de-emphasis of arithmetic, arguing that this is the skill people need more. Here we are today, though, where everyone has a calculator in their pocket. Plus, people still struggle with extremely basic arithmetic such as calculating a 10% tip... SAT scores are commonly pointed to as evidence, but over the same period Verbal scores fell even more[0]! Would you suggest that New Maths was so bad that it made people worse at English? Or maybe there's another explanation, like how the number of SAT takers dramatically increased, especially from those in the bottom percentile of high school[1,2].
If you explore the NAEP data[2], there's no decline; math scores for ages 9, 13, and 17 (the ages that participated) have improved the entire time.
The Russians famously did very well too.
It would be even a greater irony if we concluded something was a failure because we made mathematical errors in analyzing...
> Let's take the Fundamental Theorem of Calculus as an example[0]:
> f'(x) = lim_{h->0} {f(x + h) - f(x)} / {h}
This isn't the Fundamental Theorem of Calculus; it's the usual definition of the derivative of a function of a single real variable. The Fundamental Theorem of Calculus establishes the inverse relationship between differentiation and integration [0].
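For reference (this restatement is mine, not the commenter's), the single-variable form of that theorem, assuming f' is continuous on [a, b], is
int_a^b f'(x) dx = f(b) - f(a)
which is a different fact from the limit definition quoted above.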
> Unless you're Ramanujan, every mathematician has spent hours banging their head against a literal or metaphorical wall (or both!)
Ramanujan was no stranger to banging his head against the wall. My impression from Kanigel's The Man Who Knew Infinity is that his work ethic and mathematical fortitude were as astonishing as his creativity. For much of his career, he couldn't afford paper in quantity and did his hard work on stone slate, only recording the results. This could make it seem like his results were a product of pure inspiration because he left no trace of the furious activity and struggle that was involved.
From The Man Who Knew Infinity:
When he thought hard, his face scrunched up, his eyes narrowed into a squint. When he figured something out, he sometimes seemed to talk to himself, smile, shake his head with pleasure. When he made a mistake, too impatient to lay down his slate pencil, he twisted his forearm toward his body in a single fluid motion and used his elbow, now aimed at the slate, as an eraser.

Ramanujan's was no cool, steady intelligence, solemnly applied to the problem at hand; he was all energy, animation, force.
[0] Linking an EpsilonDelta video that covers this exact example in more detail https://www.youtube.com/watch?v=oIhdrMh3UJw