This is a great example of one of those things in abstract maths that is hard to follow when you first learn it, but once you’ve been through it a few times and learnt the definitions by heart, it’s really just a rephrasing of the definitions.
Trivial in math is a term that refers to anything you've already learned.
You sometimes hear people say that math is tautological. But regardless of whether it's all just an elaborate rephrasing of the axioms, it's quite beautiful.
Historically, mathematicians have spent a huge amount of time and effort formulating optimal axioms and foundations so that theorems would follow naturally from structure. That a theorem follows “trivially” from a theoretical framework that took years to develop isn’t an indictment of the theorem, but an endorsement of the incredible effort expended to develop an optimal context for expressing and understanding it.
Half of the work of mathematics is in correct definitions. Grothendieck referred to the division between mathematical labors as hunting and farming.
This is not my most popular opinion, but probably the most consequential invention of the last 400 years was the set. Suddenly all mathematical knowledge could be verified in one framework. Physicists had a target in which to state their models.
If you could state your hypothesis in the language of mathematics, "everyone" knew exactly what you meant by it, and how to go about testing your claims, or proving them, if they happened to be about mathematics itself.
Calculus was invented around 1690. Physicists like to claim that this was the most important advance in physics, but quantum mechanics and relativity didn't happen until Dedekind constructed the real numbers, 200 years later.
It turns out that knowing what you're talking about matters.
Do you have any evidence that Dedekind's formalization of the real numbers was an essential step on the road to quantum physics and relativity? This seems very doubtful to me.
A more plausible claim: the general move towards greater rigour in mathematics, one of whose expressions was Dedekind's formalization of the real numbers, improved the state of mathematical understanding in ways that were necessary for the arrival of quantum physics and relativity. E.g., to do quantum physics you want the notion of "vector space"; to do general relativity you want the notion of "Riemannian manifold"; to do special relativity maybe you want to have encountered the "Erlangen programme".
But I'm not 100% convinced. It's not unusual for physicists to make use of mathematical notions that they don't have precise definitions of. E.g., I'm not sure anyone has an entirely satisfactory formal account of "path integrals"; string theory may or may not turn out to have anything to do with how the universe actually works, but if it doesn't it probably won't be because we don't have a complete account of what it actually is. Newton managed to do pretty impressive things with calculus before anyone had a really convincing definition of such advanced notions as, er, "derivative".
The best I can do is assure you that it is informed by experience. Calculus could not progress without analysis. Einstein needed Poincaré, who needed topology, which needed sets.
> Trivial in math is a term that refers to anything you've already learned.
According to a professor, "trivial" means: "If this is not trivial for you, you should see this as a clear signal that you should take this course seriously instead of slacking off, or even that you are simply in the wrong course."
It's actually an interesting observation. If you know where your keys are, finding them is trivial, but if you don't, then even the refrigerator becomes plausible.
Math does feel like that a lot of the time. Once you've tree-searched proof space and found the connection, you can usually spend way less time proving it the next time around.
My pet peeve math term is "clear". A long time ago I thought I could teach myself group theory by buying the Springer group theory book and reading it from chapter 1, one page at a time. But I was blocked within the first 5 pages, because the axioms and first few proofs kept saying how "clear" it was that all the results followed. Unfortunately, it was not "clear" to me :(
I had a calc prof who was in the middle of a lecture, "...and as any fool can see, X is..." He stopped, turned around, and said, "You know, sometimes when I say, 'It is intuitively obvious', or, 'As any fool can see', I realize it may not be intuitively obvious, and any fool may not be able to see. But as any fool can see, X is..."
I am skeptical that this uniquely identifies a book (unless you mean the book "Linear Algebraic Groups" by the author called Springer, rather than the publisher called Springer, in which case it's definitely not the way to start learning group theory!).
It was the yellow Springer-published book. This happened 20+ years ago now, so I cannot recall the author. IIRC the title was "A Course in the Theory of Groups".
this is a standard thing in "mature" areas of math and it's absolutely the opposite of what's good for the student (all of the machinery being hidden in the definition instead of developed in the theorem's proof).
EDIT: if you hate "a monad is a monoid in the category of endofunctors" then you also hate "definitions should be hard and theorems easy".
My first instinct is to agree, but I'm not sure actually. What I really want when learning a new area of math is the full motivation for the tricky definition, taking as much time as needed to follow the dead ends of easier but worse definitions. Then I get the whole picture. IMO the motivation is the key thing for students, not the definition being easy.
Though maybe the way this course would work is in fact by proceeding through a series of easy but explicitly flawed definitions, and proving both real results and nonsense from them, so you see why the real definition is justified.
> What I really want when learning a new area of math is the full motivation for the tricky definition
yes, so then you want proofs that actually exercise real machinery instead of playing the shell game of "an X is a Y and a Y is a Z, and Z has properties ABC, therefore X has properties ABC"; you want a proof that goes through the process of using properties ABC to build Y from Z and X from Y (or something akin to that).
definitions aren't for people learning math, they're for people using math, i.e. practising professional mathematicians who are proving more theorems; Hausdorff didn't invent "Hausdorff spaces", he used/worked with various properties of topological spaces, and then when the next person came along and needed to write another paper on top, that person invented "Hausdorff space".
It’s interesting that you use the “a monad is a monoid in the category of endofunctors” example. That kind of statement is definitely hard to parse when you’re trying to learn concepts.
However, the more I’ve learned about category theory, the more I’ve understood it as a way of defining what things are and what properties follow from those definitions.
Like, a monad really doesn’t have meaning beyond “monoid in the category of endofunctors”. The same is true for monoids and endofunctors: it’s all about the properties of those objects.
In the context of programming, we can impose all kinds of meaning, but the definitions and laws are really what makes it all work when you piece it together.
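For what it's worth, those laws are concrete enough to check mechanically. A minimal sketch in Python for the list monad (the names `unit` and `bind` are my own choice here, not a standard library API):

```python
# The list monad: unit wraps a value in a singleton list, bind maps a
# function over the list and flattens the result one level.
def unit(x):
    return [x]

def bind(m, f):
    return [y for x in m for y in f(x)]

f = lambda x: [x, x + 1]
g = lambda x: [x * 2]
m = [1, 2, 3]

# The three monad laws, checked concretely:
assert bind(unit(5), f) == f(5)                                  # left identity
assert bind(m, unit) == m                                        # right identity
assert bind(bind(m, f), g) == bind(m, lambda x: bind(f(x), g))   # associativity
```

Nothing about lists specifically is used in the laws themselves, which is exactly the point of the comment above: the meaning lives in the laws, not in any one instance.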
I guess my approach is to suffer through it until some understanding is gleaned, which admittedly isn’t very satisfying or easy haha.
Yeah, as an example this is a simple corollary of the following result which gets used a lot: the quotient of a (unital, commutative) ring by a maximal ideal is always a field.
If your ring has only two ideals then the trivial ideal is maximal, and thus your ring is already a field!
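Spelled out, the step is short (a sketch, with $R$ commutative and unital as above):

```latex
% If the only ideals of R are (0) and R, take any nonzero a in R.
% The principal ideal (a) is nonzero, so it must be all of R,
% and in particular 1 lies in (a):
a \neq 0 \;\implies\; (a) \neq (0) \;\implies\; (a) = R
  \;\implies\; \exists\, b \in R : ab = 1.
% Every nonzero element is invertible, so R is a field.
```

Equivalently, via the quoted result: $(0)$ is maximal, and $R \cong R/(0)$ is a field.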
The more you know, the more "shortcuts" you start seeing, I guess.
> How to distinguish mastery of a complex subject from parsing one formally expressed in a complex way?
In my opinion: the difference between a complex subject and one formally expressed in a complex way is that in the former, the results that you get are really deep (understanding them at the end feels like a spiritual experience).
Knowing the rules of chess doesn’t make you a chess master. Knowing the syntax and semantics of a programming language doesn’t make you a master software architect.
It depends what you include in your definition of "parsing". For example, the chess master looks at the board and "parses" it in a very different way to the amateur player: he includes his knowledge of thousands of games played, analysis of potential future outcomes, and so on.
I disagree. "Parsing" is the first level of understanding. If you are not moving past the parsing level, you have not achieved any kind of mastery.
My experience is that mastery means more like "you have a mental model which gives you 'intuitive' reasons to accurately classify things as true/false and provides some motivation for the reasoning".
An example: you see someone has solved a degree 4 equation by repeatedly applying the quadratic formula, getting 8 solutions. "No way."
Another example: watching a famous baking show, you see somebody put a bunch of different-sized pieces of bread in the oven at the same time. Right away you think: "aren't they going to cook at different rates?" Sure enough, some burned, some raw.
It depends a bit on your tastes. The content falls under the broad umbrella of Abstract Algebra, more specifically Ring Theory, or perhaps Field Theory if you squint a bit. Those are your keywords.
"Applications of Abstract Algebra with Maple and MATLAB" by Klinger, Sigmon, and Stitzinger is apparently good for those with an engineering background: https://www.maplesoft.com/books/details.aspx?id=624.
If you're committed, then any introductory text on abstract algebra or group theory might capture your interest.
I would recommend starting with applications or something as close to your wheelhouse as possible just to stay motivated. Abstract algebra, in particular, is known for requiring quite a lot of machinery before obviously connecting with other things, which can feel like an onslaught unless you're inherently interested.
Have fun though! It's really one of the deepest subjects in modern math, IMHO. Almost every field has been affected by its results.