(An entire section in the article is devoted to this point.)
Programming languages and tooling have attracted phenomenal amounts of attention and iteration in just a few decades. Folks hereabouts just about go to the damn mats over the smallest syntactic quibbles.
Meanwhile, over in mathematician land, they're limping along with a write-optimised mishmash of "the first thing that a genius in a hurry scribbled three hundred years ago and now we're stuck with it".
I have very limited attention and brainpower, and remembering what ∂ means in this book vs the other twenty places it's used in my library, where it means something completely, entirely, utterly different, is just hard. So too is remembering what Q(x) and W(z) mean for two hundred densely written pages of stuff optimistically described as accessible to anyone who has taken high school algebra (not mentioned: and remembers it exactly).
I think this is an exaggeration/strawman. Can you give an example of a mainstream introductory math book that defines some esoteric notation like "Q(x)" and then continues using it hundreds of pages later? Note that I don't mean defining it locally within the context of one proof -- that's something different.
The closest thing I can think of is notations like U(f, P) and L(f, P) for the upper and lower Riemann sums of a function f with respect to a partition P. These do show up in introductory real analysis textbooks, but they're pretty basic/fundamental notions.
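For anyone following along, these are the usual definitions (modulo each author's exact lettering): for a bounded f on [a, b] and a partition P = {a = x_0 < x_1 < ... < x_n = b},

    U(f, P) = \sum_{i=1}^{n} M_i\,(x_i - x_{i-1}), \qquad M_i = \sup_{x \in [x_{i-1},\, x_i]} f(x)
    L(f, P) = \sum_{i=1}^{n} m_i\,(x_i - x_{i-1}), \qquad m_i = \inf_{x \in [x_{i-1},\, x_i]} f(x)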
By the way, as far as I know, ∂ in introductory textbooks can only mean either "partial derivative" or "boundary". Do you have examples of it meaning other things?
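To spell out the two readings I mean, in LaTeX:

    \frac{\partial f}{\partial x} \qquad \text{(partial derivative of } f \text{ with respect to } x\text{)}
    \partial \Omega \qquad \text{(topological boundary of the set } \Omega\text{)}

Same glyph, two conventional readings.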
Most of what I've seen lately is queueing theory, Markov processes, or statistics, and yes, I have seen notation used for hundreds of pages at a time, and I've seen these symbols used to mean different things (not counting the partial-derivative meaning).
One of the books I read recently[0] has a table of where symbols and formulae are first defined. It's a godsend.
But I would prefer self-describing names to a lookup table. I don't put comments at the top of each file of code with such a table.
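To make that concrete, here's a toy sketch in Python. The names and the framing are mine, not from any of the books above; the math is just the standard M/M/1 results, utilization ρ = λ/μ and Little's law.

    # Textbook-style symbols: fine on a blackboard, opaque two hundred pages later.
    def W(lam, mu):
        rho = lam / mu                 # utilization, requires rho < 1
        L = rho / (1 - rho)            # mean number in system for an M/M/1 queue
        return L / lam                 # Little's law: W = L / lambda

    # Self-describing names: the lookup table lives in the code itself.
    def mean_time_in_system(arrival_rate, service_rate):
        utilization = arrival_rate / service_rate              # requires utilization < 1
        mean_jobs_in_system = utilization / (1 - utilization)  # M/M/1 mean queue length
        return mean_jobs_in_system / arrival_rate              # Little's law

Both return the same number; only one of them still reads sensibly when you come back to it a month later.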
Math is hard, and reading it is slow even for good mathematicians. However, it does get easier with practice, as one learns the conventions and norms. It is not self-explanatory; aliens would not glance at a notebook and immediately know that it is a proof about the cardinality of the set of prime numbers, for example.
Notational ergonomics matter, dammit.