Why are mathematicians so bad at arithmetic? (2017) (mathwithbaddrawings.com)
110 points by tomerv on Feb 4, 2022 | 124 comments



How can you write such an article without mentioning the Grothendieck prime?

https://www.ams.org/notices/200410/fea-grothendieck-part2.pd...:

One striking characteristic of Grothendieck’s mode of thinking is that it seemed to rely so little on examples. This can be seen in the legend of the so-called “Grothendieck prime”. In a mathematical conversation, someone suggested to Grothendieck that they should consider a particular prime number. “You mean an actual number?” Grothendieck asked. The other person replied, yes, an actual prime number. Grothendieck suggested, “All right, take 57.”

But Grothendieck must have known that 57 is not prime, right? Absolutely not, said David Mumford of Brown University. “He doesn’t think concretely.” Consider by contrast the Indian mathematician Ramanujan, who was intimately familiar with properties of many numbers, some of them huge. That way of thinking represents a world antipodal to that of Grothendieck. “He really never worked on examples,” Mumford observed. “I only understand things through examples and then gradually make them more abstract. I don’t think it helped Grothendieck in the least to look at an example.”


For those who can read French, this is probably as good a place as any to mention that Grothendieck's journal Récoltes et semailles was recently published by Gallimard: https://www.gallimard.fr/Catalogue/GALLIMARD/Tel/Recoltes-et...


Maybe Grothendieck, like Avicenna's conception of God, only knew particulars "in as much as they are universal".

Grothendieck once flunked an exam due to—in his own words—"une erreur idiote de calcul numérique" ("an idiotic error of numerical calculation").


Yeah, a lot of pure mathematics education very much emphasizes learning the "spirit" or fundamental concepts of things rather than computation, sometimes to an extreme even when compared with applied mathematics. This is especially obvious in anything that even approaches calculus.

In the math courses I took in college, if we e.g. ever had a differential equation we needed to solve on a problem set, "looked up the answer on Wolfram Alpha" was a perfectly reasonable response. Same for differentiation or integration. In some intro courses you'd be asked to prove that certain differentiation or integration techniques had mathematically rigorous backing, but you never had to do the rote work of actually memorizing and using those techniques ever again. Again, "looked up the answer on Wolfram Alpha" was a perfectly reasonable response. It was a far cry from the applied mathematics department.

Another huge discrepancy was in linear algebra. Very little time was spent on matrix computation, other than (again) proving that matrix operations and invariants preserved properties of linear maps and vector spaces; almost all of the course was spent on the linear maps and vector spaces themselves. Linear algebra in the applied math department, by contrast, was almost entirely about matrices and the intricacies of various matrix computations and decompositions, and subjects like dual spaces or other things that couldn't be represented with ordinary matrix computation (e.g. infinite-dimensional vector spaces) were omitted.


Once I got past calculus, it seemed like we stopped caring about numbers. Plenty of 0s and 1s, the occasional 2, but rarely a 3 or higher. I remember one theorem had a 24 in the derivation (Stone-Weierstrass, I want to say?) and we were all amazed.

Given that, who needs arithmetic?


To add to this comment, a lot of times we didn't even care what the solution was, only that it existed, was unique, or was bounded by some constant (and we didn't usually care about the value either). The more you advance into pure mathematics, the less you care about specific functions or values.


From what I know, this happened in large part due to the three-body problem. Once it was proved that there was no closed-form solution in terms of elementary functions, Poincaré suggested that we instead try to see if solutions existed at all, and whether they were unique. If so, we could at least hope for numerical solutions. I think this was partly responsible for the development of "analysis situs", i.e. topology. [1]

This approach was fruitful in yielding concrete techniques - we now have efficient methods like the fast multipole method of Rokhlin and Greengard to numerically tackle n-body problems. Without existence and uniqueness results, there is no basis for the correctness of these algorithms.

[1] https://en.wikipedia.org/wiki/Poincar%C3%A9_and_the_Three-Bo...


And I suspect, this is what makes math grind its own gears.

The way math is "built", with overly general theorems, non-constructive statements, etc., makes it fall into traps like Russell's paradox (which literally cannot exist "in real life" - not even for a very wide definition of mathematical real life).

Building math on saying "this exists" without actually building towards what it is, is a bit of a "sin".


You aren't very explicit about what you mean, but I think I disagree.

The main thing is that willing things into existence (like the Axiom of Choice, the C in ZFC) is equally consistent with not having it (ZF + not AC) or not deciding it (ZF), as shown by forcing.

Russell's paradox in particular was worked around by not allowing you to "build sets" by quantifying over all sets (see the axiom schema of specification).

If you are talking about avoiding "unprovable statements" (i.e. Gödel's incompleteness theorem), you have to strip things back much further: it applies to any system that contains e.g. Peano arithmetic. An actual practical statement that is unprovable in Peano arithmetic is Goodstein's theorem:

- write out a number in base 2, including its exponents (i.e. recursively)

- replace all the 2s with 3s

- subtract 1

- write out the result in base 3, including its exponents

- replace all the 3s with 4s

- subtract 1

- ...

The question is whether for all starting numbers this sequence eventually ends at 0. The answer is yes, trivially so if you use ordinals (replace all your base numbers with omega: the +1 basically does nothing, and the -1 means the process must be finite because the ordinals are well-ordered), but this cannot be done in Peano arithmetic.
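
A minimal sketch of those steps (helper names are mine; the hereditary-base bookkeeping is the only subtle part):

    def bump_base(n, b):
        # rewrite n in hereditary base b, then evaluate it with base b+1
        result, power = 0, 0
        while n > 0:
            n, digit = divmod(n, b)
            if digit:
                result += digit * (b + 1) ** bump_base(power, b)  # exponents recurse too
            power += 1
        return result

    def goodstein(n, max_terms=8):
        seq, base = [n], 2
        while n > 0 and len(seq) < max_terms:
            n = bump_base(n, base) - 1
            base += 1
            seq.append(n)
        return seq

    print(goodstein(3))  # [3, 3, 3, 2, 1, 0] -- hits 0 quickly
    print(goodstein(4))  # [4, 26, 41, 60, 83, 109, ...] -- takes unimaginably long to hit 0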

Fundamentally, like with the Paris–Harrington theorem, I think the way to think about why this cannot be proved is that these sequences get enormously big before they finally come back down to 0.


A better example is the "real" numbers, which do not physically exist but are a constant source of theorems that don't apply to physics, and this is taken as proof that real numbers are a fascinating complex object, instead of a poor foundation for analysis. Standard Mathematics does not use a good model for infinitesimals, which should not be allowed to be densely non-differentiable.


Yep, this is a good example. And maybe it helps illustrate my point.

On N and Q we can come up with any number we want. Even though there are infinitely many numbers, we can write down any particular one.

On R we can do that as well. But at the same time we can come up with "vapid" statements like "for each x in R there exists x' < x", which are completely correct but don't really mean anything.

True, the R's are a good foundation, and in a way they do exist physically (after all, Pi exists in nature).


Thanks for your answer, I didn't know about Goodstein's theorem.

> Russel's Paradox in particular was worked around by not allowing to "build sets" quantifying over all sets (see axiom schema of specification).

Ah, I didn't know that by name, but this is pretty much the gist of what I'm getting at.

ZF and AC, while axioms, seem like a better foundation than something that allows a "set of all sets", which sounds like a vague specification even to a mathematician. And in the end, Russell's paradox raises exactly that issue of (pardon my non-mathematical language) "the set of all sets is BS".

Hence the axiom of specification, which lets you build sets that exist.

I'm not talking about Gödel or unprovable statements, because even if something is not provable it might be constructible.


> I'm not talking about Gödel or unprovable statements, because even if something is not provable it might be constructible.

What precisely do you mean by that statement? What is the thing that is not provable but constructible? A conjecture? If so, then what do you mean when you say that a conjecture is constructible? What does it mean to construct it?

As far as I know, "construction" in maths usually means a finite process. And proofs are finite sequences of strings, satisfying certain rules specified by a proof framework (such as sequent calculus). So I'm curious in what way something could be constructible, but not provable. To be honest, I don't understand how those words could apply to the same things, because proving usually applies to conjectures, statements, formulas etc., while constructing usually applies to definitions and proofs, which cannot be said to be true or false at all.


> What is the thing that is not provable but constructible?

Just look at the grandparent message for an example: "An actual practical statement that is unprovable in Peano arithmetic is Goodstein's theorem"

Or proving the Collatz Conjecture. You need only basic algebra to build it, but something much more complex to prove it (if it is provable)
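
And "building it" for Collatz really is a few lines (a sketch, names mine):

    def collatz_steps(n):
        # halve if even, 3n+1 if odd -- trivial to state, famously hard to prove anything about
        count = 0
        while n != 1:
            n = 3 * n + 1 if n % 2 else n // 2
            count += 1
        return count

    print(collatz_steps(27))  # 111 -- but whether it halts for *every* n is open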


> I'm not talking about Gödel or unprovable statements, because even if something is not provable it might be constructible.

> Or proving the Collatz Conjecture. You need only basic algebra to build it, but something much more complex to prove it (if it is provable)

It seems like your first statement is about constructibility of the same object (that should be a proof). In your second statement you talk about constructibility of an object and provability of a certain statement about an object. I may be judgemental, but being so imprecise is a big no-no in metamathematics.


> The way math is "built", with overly general theorems, non-constructive statements, etc., makes it fall into traps like Russell's paradox

I don't think Russell's paradox is a problem with ZFC set theory. Building sets out of "the set of everything" simply isn't allowed, and people are even careful to specify when the axiom of choice is being used.

> Building math on saying "this exists" without actually building towards what it is, is a bit of a "sin".

It's actually not. The idea is that you prove only what you actually need for the next step. A nice example is again in the field of differential equations. Ultimately people are interested in a solution, but to get that solution you need an approximation method; to get an approximation method you need to know when it works and how fast it converges; and to know that, you need to know when a solution can be found and what properties it has.

Constructivism is something that might sound interesting in theory, but forcing all mathematics to fit that model only serves to make certain proofs of true and useful statements far more complicated, or even impossible.


It's rather unscientific to say that we believe things because we want them to be true and not because they are inherently true.


Nothing is inherently true; things are only true based on some axioms and hypotheses. What I meant by "results that are true" is things that are consistent with our physical world.


Mathematicians find what’s possible, then leave the algorithms to the computer scientists.


Time to read "Mathematics: Form and Function" by Saunders Mac Lane, my friend.


My university had exams that involved arithmetic, matrix computation and differential equations, with no access to Wolfram Alpha. It was impossible to pass the exams without prior training.


That's funny. I have a degree in maths, which I guess qualifies me as a mathematician, although I've always worked as a software developer.

So, the thing is that I'm not a good mathematician, but I happen to be good at arithmetic. By this I mean that I'm good with mental math (of the "square a 4-digit number in ten seconds or less" type; I'm not nearly as good as the savants out there who perform operations with many more digits, and much more quickly), but also that I like playing with numbers and I've come up with a ton of silly "theorems" [1] (I'm not sure if they even deserve that name) that are mostly based on basic modular arithmetic, so any actual mathematician might find them amusing but nothing more, while a layperson is often amazed. So, mental arithmetic can be useful to make people believe that you are smarter than you actually are :) . I'm not fond of doing this, by the way. But it's a thing that happens.

[1] Here goes an example. Take a number that is a multiple of 73, that has exactly seven digits, and that has a zero somewhere in the middle. Say, 73*75391 = 5503543. Then you can exchange whatever goes before and after the zero, and the result is also a multiple of 73: 3543055/73 = 48535. For added WTF, I'd like to mention that it also works with 137: 137*8621 = 1181077, and then, 7701181/137=56213. The proof is surprisingly simple, but you need to know what to look for.
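
For the curious, the trick can be brute-force checked in a few lines; the thing to look for in the proof is that 73 x 137 = 10001, which divides 10^8 - 1 (a sketch, helper name mine):

    def swap_around_zero(n):
        s = str(n)
        i = s.index('0', 1, 6)             # a zero among the middle digits
        return int(s[i+1:] + '0' + s[:i])  # exchange what's before and after it

    for d in (73, 137):
        for k in range(10**6 // d + 1, 10**7 // d + 1):  # all 7-digit multiples of d
            s = str(d * k)
            if '0' in s[1:-1]:
                assert swap_around_zero(d * k) % d == 0
    print("verified for 73 and 137")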


A great book for this is "Secrets of Mental Math" by Benjamin and Shermer [0]. I remember a feeling of fun when I memorized the tricks many years ago, but unfortunately I lost the digital flash card deck for reviewing them, so I've forgotten them now. I should recreate it.

[0] Goodreads link: https://www.goodreads.com/book/show/83585.Secrets_of_Mental_...


I came up with my own shortcuts for mental math back when I was in middle school. But since then, whenever I've needed to do anything more than a simple calculation I've just reached for a calculator, and I have basically lost the ability to do the longer work in my head. While it would have been nice to retain that ability without continued practice, pulling out my phone's calculator is simply faster and won't make an error like I easily can in my head when trying to remember all the intermediate numbers I'm calculating.


Calculators are preferred for important calculations, though upon further thought, I remember reading that mental math is prized in the finance sector.

Out of curiosity, a search on r/FinancialCareers provided anecdotes as some evidence that it mostly helps you seem smart to higher-ups, and also does have a niche role in prop trading [0]:

1. "Prop trading firms highly value quick mental math. I work at a prop in Chicago, specifically market making in the broker market on CME products, and very fast mental math is required as part of my daily job. I calculate my hedges using mental math, and it’s absolutely crucial I’m quick (futures can move fast). The mental math I do daily is mostly multiplication and division of two-digit numbers, as well as some fraction multiplication. Although prop trading firms are few and far between now, so it’s a niche field and a skill with little required use outside of trading."

2. "Youll find that aa lot of politcal wins are made in meetings with higher-ups. These people often have little time and want things boiled down to a mental math state (they dont have time for you to fix your model in real time). This is where quick mental math can get you a good impression. If an exec has a question surrounding the effect of something that isnt in your model, you will need to do quick mental math to figure it out. This exec doesnt have time for you to do more than 5 key strokes. It is important as it makes you seem like a thought leader."

3. "Extremely relevant. Will help you impress in internal and external meetongs"

[0] https://old.reddit.com/r/FinancialCareers/comments/c7c38l/is...


Quick estimation and sanity checking are useful skills even if the real program is going to be much more complex.


Arithmetic (mental math) is a skill that must be practiced like juggling. I used to be a high school math teacher and spent a few hours a day working problems on the board. I got to the point where I could work through stats problems with 3 decimal places about as fast as the students could do them with calculators, and I didn't need to write out my work (although I did). Now 5 years after leaving I prefer to reach for a calculator when adding or subtracting 2 numbers, just to make sure I don't make a mistake.


Watching math professors working problems on the blackboard I am constantly amazed at how rapid they are. They have the training advantage that when they make a mistake, which is not uncommon, someone in the class will immediately point it out.


Also, we do them often. Some problems we use every year (CS teacher here, but I assume the same applies)


> Why Are Mathematicians So Bad at Arithmetic?

Why are Programmers So Bad at Programming Bug-Free?


Because we spend 90% of our time in meetings?


I'm curious how much of that is attributable to aging.


It could, possibly, but when I studied math in college I got incredibly good at mental math. A lot of it is memory or number tricks and a bit of practice; I would say anyone can learn them regardless of age. I am many years out of college now, but even by the time I was in my mid-twenties and a few years out of school, my ability to do simple problems in my head had gone down drastically since I wasn't doing it nearly as often. Age may play a factor, but probably not as much as you would think.


Not all mathematicians are bad at arithmetic. Both Euler and Gauss were brilliant calculators. Another comment mentions Ramanujan. In more recent history, Arnold in his exams would give problems with plenty of arithmetic estimates that had to be performed orally and be within 10% of the exact answer.

In a surprising turn of events, people studying mathematics are a large group, and some of them are indeed incapable of factoring 57.


The question in the article title sounds to me like:

> Why are people from the USA bad at navigating the New York subway?

And you're right to point out:

> People from New York aren't.

If people are surprised by the idea that many mathematicians are bad at arithmetic, then I'm guessing they also think that arithmetic is taught first because it's fundamentally foundational, rather than something to do with Diophantus having lived before Erdős.


51 looks kinda prime, too.


Well, theoretical studies teach you about navigating, managing and creating complex structures;

life teaches you to distrust and verify all appearances.


What we need now is someone giving an anecdote about how they're a published math researcher but at the job interview they were screened with Times Table Rock Star and given a take-home of 20 quadratic equations to solve.


> Times Table Rock Star

When I was a child, the schools tried to get me to memorize the multiplication tables. At least through ten, teachers would say. I thought it was pretty silly.

You shouldn't force memorization. It should come naturally as you use certain things many times. If you use 7x6 often, you will remember it. If not, you will forget it. That should be totally fine. Why should it matter if you memorized 7x6 or you had to go the long way and say 7x5 is 35 so add 7 to 35 and you'll get 7x6?

Imagine if you said, good programmers are often fast typists and some idiot decided to add a touch typing test to their hiring process.

» Goodhart's law is an adage often stated as "When a measure becomes a target, it ceases to be a good measure".

Edit: turned asterisks into x. Please read x as multiply.


I'm a mathematician. When I was in grade school they told us to memorize up to twelve. But I figured I could skip the evens, because doubling is easy, so I would just memorize the odds. One is trivial, three and five are simple, as is eleven. There is an easy algorithm for nine. So I memorized seven and called it done.

When I think of the thousands of multiplications I've done since then, it's clear I was a dope and my teachers were perfectly right.


I noticed that there's really only a 4x4 square in the multiplication table (6-9) that isn't obvious, and since the table is mirrored along the diagonal, there are only really 10 entries you truly need to memorize: 6x6, 6x7, 6x8, 6x9, 7x7, 7x8, 7x9, 8x8, 8x9 and 9x9.


The only unintuitive element of the times table is 8*7=54.


"8, 7, 54" !?

I do not get the joke. What do you mean?

Suppose now somebody threw in that expression seriously. Would not that be an admonition to learn the tables and learn each element properly (do, check, memorize; do, check, memorize, check, check, check)?


Not sure if there's a separate in-joke involved, but 7x8=54 was a joke in the article.


I disagree. This "you only need one eye and one ear - two is redundant" approach to memorization is at the root of the dire situation today, in which I know youngsters with some diploma who can go "I thought that 5x8 was [any number]. Really, it is 40?!".

While studying, in general, mnemonic hooks lay a foundation on which further notions become solid. Then the whole of them become available as micro-tools for reasoning.


Hard disagree. The multiplication table is incredibly useful when making back of the envelope calculations.

You can cheat though. 2, 5, 9, and 11 have simple deterministic algorithms you can apply to get the result. These help, but the multiplication table is one of the only useful maths concepts that's stuck with me from grade school.


I've gotten pretty good at deconstructing (a+b)c into ac + bc and working that out in my head - for example, 36 x 3 is (30+6) x 3 = (30 x 3) + (6 x 3) = 90 + 18 = 108. I couldn't do it if I didn't remember the multiplication tables up to 10, though.


Back of the envelope means you can estimate, so you don't need to memorize the whole table.


You need those key→value pairs promptly available when you are actually performing the "back of the envelope" computations.


Remark #2 on the same:

look, rightly, you do not need to memorize the tables by recitation¹: but then, consistently, you should (as a learner) actually perform all of those multiplications as many times as needed to make you know the results as if you had memorized them, even if in a less "experienceful" manner.

¹Which is not at all necessary, by the way - on the contrary. When you go "5x6=30", you are not supposed to keep that in a mental veneer while you think of something else entirely: you are supposed to keep that thought rooted to its foundations, made of structures built on (six and five) sets of five and six, and internally see it as evident. You can recite with presence and awareness.


I did something like this to learn the times table in grade school. What the school wanted us to do seemed ridiculous and extra boring, and I figured that if I just kept a visible table handy while doing the multiplication exercises, it'd go into memory along the way "for free". This worked.

It might not work well for others, though you'd expect there's some learning method better than how they did it in my day, on both effectiveness and fun. (I forget exactly how we were supposed to do it.)


If you are unable to recite 7x6 and just reach for the calculator instead, I don't think you will ever memorize it.

I did the same, up to 20. I've used it so much since then that even though it seemed silly at the time of learning, the total time saved has proved far greater than the time spent memorizing it. It helps that I went into STEM at university and am interested in it in general. But I'd say that over a lifetime, an average Joe/Jane handling a normal person's finances will at some point have saved more time than they spent learning it.


> It should come naturally as you use certain things many times.

But if it's not second nature, you may decide not to do it many times. Is it "need this -> gets memorized" or "memorized -> can use as tool"?

Personally I've never been a proponent of memorizing things, preferring to Google when needed, but I can see how the argument runs.


> You shouldn't force memorization. It should come naturally as you use certain things many times. If you use 7*6 often, you will remember it. If not, you will forget it. That should be totally fine. Why should it matter if you memorized 7*6 or you had to go the long way and say 7*5 is 35 so add 7 to 35 and you'll get 7*6?

As it turns out, though intuitively appealing, this idea is wrong.

You can't learn to touch-type by programming. You have to focus some effort on specifically the touch-typing aspect of things. Half an hour a day of directed practice at touch-typing for a month will improve your typing speed more than ten hours a day of programming for a year. Similarly with multiplication tables: if you devote the necessary effort to memorizing the 28 or so necessary facts and speeding up recall on them, your speed at long multiplication increases dramatically. In my childhood I refused. When confronted with a multiplication problem, I would get medieval on it: mediation and duplation! This was a bad strategy.

In fact, I've often wondered if committing more lookup tables to memory might help more. Consider the Briggsian logarithms:

    log₁₀ 1.1 = .0414
    log₁₀ 1.2 = .0792
    log₁₀ 1.3 = .1139
    log₁₀ 1.4 = .1461
    log₁₀ 1.5 = .1761
    log₁₀ 1.6 = .2041
    log₁₀ 1.7 = .2304
    log₁₀ 1.8 = .2553
    log₁₀ 1.9 = .2788
    log₁₀ 2.0 = .3010
    log₁₀ 3.0 = .4771
    log₁₀ 4.0 = .6021
    log₁₀ 5.0 = .6990
    log₁₀ 6.0 = .7782
    log₁₀ 7.0 = .8451
    log₁₀ 8.0 = .9031
    log₁₀ 9.0 = .9542
Consider the problem of computing the horizontal scan frequency of a minimal viable text terminal CRT. It refreshes at 60 Hz, contains 24 lines of text of 8 scanlines each, and has a 10% VBI, so the problem is multiplying 60 · 24 · 8 · 1.09. In Magic Logarithm Land, we just have to add the logarithms. We can mentally interpolate log₁₀ 24 as 1.30 + .4 · (.48 - .30) = 1.38 and log₁₀ 1.09 = .9 · .04 = .04, so we have 1.78 + 1.38 + .90 + .04 = 4.10, which is between 4.08 and 4.11, so the answer is about 12800 Hz. The exact answer with the multiplicands as given is 12556.8, so that's less than 2% error, as it usually is when rounding to two places. Even that is excessive precision given the fuzziness implicit in the VBI figure.

I can't actually do this mentally, partly because I haven't memorized even the list of 17 logarithms listed above, so instead what I end up doing is something like, well, 24 · 8 is about 25 · 8, which is 200, and ×60 gives 600 + 600 = 12000, and then we add 10% for 13200, and then we'll be about 5% high because of rounding up the 24 and the 1.09, which also gives 12700 or 12800. But I can't help thinking that mentally adding 78 + 38 + 90 + 4 would be easier!
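
To make the comparison concrete, here is the worked example from above as it would run, assuming only the two-place logs (a toy check, not anyone's actual mental process):

    # two-place logs: 60 -> 1.78, 24 -> 1.38 (interpolated), 8 -> .90, 1.09 -> .04
    log_sum = 1.78 + 1.38 + 0.90 + 0.04   # = 4.10
    estimate = 10 ** log_sum              # ~12589
    exact = 60 * 24 * 8 * 1.09            # 12556.8
    print(estimate, exact)                # error around 0.3%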

Memorizing a quarter-square table for the integers up to 100 would also help with this kind of thing.
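
The quarter-square trick, for anyone unfamiliar: since ab = ⌊(a+b)²/4⌋ - ⌊(a-b)²/4⌋ exactly (a+b and a-b always have the same parity), a memorized table of ⌊n²/4⌋ turns any product into two lookups, one addition and one subtraction. A sketch:

    qs = [n * n // 4 for n in range(201)]  # quarter-square table, precomputed once

    def multiply(a, b):
        # valid for 0 <= a, b <= 100; no multiplication at lookup time
        return qs[a + b] - qs[abs(a - b)]

    print(multiply(57, 43))  # 2451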

Agreed, though, that a typing test for programmers or a times-table test for mathematicians would be counterproductive.


Sure, having your tools ready - in some cases this means memorized - will be useful when needed, but.

Consider your example of the CRT. You favor interpolating, I tend towards factoring - no issue.

First we memorize the log₁₀ of the main factors:

  2→ .3 , 3→ .48 , 5→ .7 , 7→ .845 , 11→ 1.04 , 13→ 1.115 , 17→ 1.23 , 19→ 1.28
...a good exercise that may allow us, potentially, to perform mental computations through simpler operations within a 1% error margin.

And we could, in the example, perform (.78+1)+(.9+.48)+.9+(1.04-1) = 4.1 . Good. But now, one still needs a calculator to perform 10^4.1 ... :) This does not happen when we mentally multiply - we have ways to remain within acceptable approximation and still not need "crutches" printed in paper or silicon.

Though of course, again: better with than without resources. As far as I am concerned, that handful of factors will be memorized with priority not inferior to the date of the Battle of Bosworth.


You don't need a calculator to perform 10^4.1; 10^.08 = 1.2 and 10^.11 = 1.3, so (interpolating) 10^.1 is about 1.27, so 10^4.1 is about 12700. (The truth is about 12590, but that's within the margin of error.)


You are right, I did not realize.

But one should use more precise values to make it work.

10^0.08 = 1.202 and 10^0.115 = 1.303. The value 0.1 sits 20/35 of the distance from 0.08 to 0.115, so you proceed with the same proportion on the results and obtain 1.202 + 0.057 = 1.259.

Thing is, practice is required, and experience in managing precision confidently in this realm.

Edit:

In fact, the linear interpolation can work surprisingly well, but the space in linear vs logarithmic remains warped, and it is easy to err without realizing.

I just threw those numbers in a calculator to verify it:

  l12.d = Log10(1.2)           ==>  0.0791812460476248175522684
  l13.d = Log10(1.3)           ==>  0.1139433523068367759556452
  dst.d = (0.1-l12)/(l13-l12)
  lrs.d = 1.2 + 0.1*dst        ==>  1.2598892190166359750236325
  Log10(lrs)                   ==>  0.1003323596533426953492096
  Pow(10,0.1)                  ==>  1.2589254117941672816982646


Sure, especially with very imprecise values like 1.2 and 1.3 the error can be quite significant; generally you only interpolate tenths to get a single extra digit of precision, because mentally multiplying or dividing by .2 or .3 is a lot easier than multiplying by 20/35 or something. In theory you could use a more precise form of interpolation like quadratic Hermite interpolation or cubic spline interpolation, but for mental work I think that almost never pays off.

You may be aware that the linear interpolation error for a regular function scales as the square of the interval size, so if the table you're interpolating from has 10× as many entries, you get 100× less absolute error.


Sorry, but I have difficulty reading your intention in

> We can mentally interpolate log₁₀ 24 as 1.30 + .4 · (.48 - .30) = 1.38

Easily one could go log₁₀ 24 = log₁₀ (6*4) = .78+.60 = 1.38, for example,

but you went something along the lines of "log₁₀20 + .4 · log₁₀(3/2)", making "24" read as something like "20x(3/2)^(4/10)"... What was the intended process?


Sorry! I was saying that 2.4 is between 2.0 and 3.0, whose logarithms are .3010 and .4771. The distance from .3010 to .4771 is .1761, or .18 to two places, so if we linearly interpolate .4 of the way from .3010 to .4771, we get .3010 + .4 · .1761 = .3010 + .0704 = .3714. Doing it with two-place logs, we get .30 + .4 · .18 = .37. The correct value for log₁₀ 2.4 rounds to .3802, so the linear interpolation is not very precise.

In fact I happened to make an error in the mental interpolation that coincidentally was in the right direction: I estimated .4 · .18 ≈ .4 · .2 = .08, giving me the more correct .38 by luck.

It is of course true that if you factor your numbers you can get better precision with fewer table entries, as you did. But I think it's easier to mentally routinize linear interpolation between table entries so you can do it instantly than to mentally routinize factorization. Such mental linear interpolation was commonplace when using physical printed books of logarithm tables because it easily gives you, say, 4-place precision out of a 3-place table, allowing you to use a book that's one tenth the size and contains one tenth the incorrect entries.

Even very small amounts of memorization can give you lots of logarithms if you do mental work with them. With just log₁₀ 2 ≈ .3010 (and log₁₀ 10 ≡ 1), we can easily derive log₁₀ 5 ≈ .6990, log₁₀ 4 ≈ .6020, log₁₀ 8 ≈ .9030, log₁₀ 2.5 ≈ .3980, log₁₀ 1.6 ≈ .2040, log₁₀ 1.25 ≈ .0970, etc. If you additionally know log₁₀ 3 ≈ .4771, you can easily get 6 = 2·3, 9 = 3·3, 1.2 = 6·2/10, 1.8 = 9·2/10, 2.4 = 1.2·2 (as you said), 1.5 = 5·3/10, etc.
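
That bootstrapping is easy to sanity-check (a sketch; only log 2 and log 3 are "memorized"):

    import math

    L2, L3 = 0.3010, 0.4771
    derived = {
        5: 1 - L2,         # 5 = 10/2
        4: 2 * L2,         # 4 = 2*2
        8: 3 * L2,         # 8 = 2^3
        2.5: 1 - 2 * L2,   # 2.5 = 10/4
        1.6: 4 * L2 - 1,   # 1.6 = 16/10
        1.25: 1 - 3 * L2,  # 1.25 = 10/8
        6: L2 + L3,        # 6 = 2*3
        1.5: L3 - L2,      # 1.5 = 3/2
    }
    for x, v in derived.items():
        print(f"log10({x}) ~ {v:.4f} (true {math.log10(x):.4f})")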

Equal temperament takes advantage of the fact that 3 is very close to 2 to the 19/12 power, about 0.1% larger; note numbers are frequency logarithms to the base of the 12th root of 2. Unfortunately that isn't very helpful if the numbers you need to calculate with are given in decimal form...


The mathematicians are probably wrong. Mathematicians will have above-average ability with arithmetic, but they're going to be meeting people who memorise pi to 50 decimal places or happen to calculate the sum of small cubic numbers for fun to pass the time and are really good at arithmetic.

What the mathematicians are probably cluing in to is the fact that around half of mathematicians are below median arithmetic ability in their immediate academic community.


> Mathematicians will have above-average ability with arithmetic

They don't. My family has multiple mathematicians in it, and as a kid I spent a lot of time with their colleagues on activities. Many regularly make huge mistakes in basic arithmetic like multiplication or addition. Some of them do "memorise pi to 50 decimal places" or learn some other "stunt" for the fun of it or in order to impress people. But they all treat it as more of a stunt than something meaningful.

It just does not matter for the work of a mathematician whether you are good at arithmetic. So they don't care about becoming good at it. They just joke about it when someone makes a mistake yet again.


> but they're going to be meeting people who memorise pi to 50 decimal places or happen to calculate the sum of small cubic numbers for fun to pass the time and are really good at arithmetic.

Not really. The "above average ability" depends on the population: if you compare mathematicians with other college educated people, I'd guess they'd be at the bottom among the STEM fields. It wasn't weird to see people (or myself) having the occasional "wait, what's 5 times 7". And it shouldn't be surprising if people knew that 99% of the arithmetic operations done in math degrees are things like "2+1" or "2·3".


It's unsurprising that abstract mathematicians are worse at engineering math than engineers, and vice versa for abstract math.


What I liked about math in engineering school was that after a certain point, it was really all about numerical methods. Because it is such an applied field, of course you want actual numbers - that's your bread and butter. You first learn the basics, the same way a mathematician learns something, but afterwards it's all simulation and approximations.

During grad school, however, there was a complete shift from this - IMO. You suddenly had more classes with math for mathematicians. Of course, this was because you took a lot more advanced and cutting-edge classes, which were often built on various fields of math. New tech often follows that route... breakthrough discoveries in math or physics, often described in dense mathematical language, before it eventually gets more applied and approximated forms, which are better suited for real-life uses.

I've also noticed that working engineers tend to be much better at mental computation than most researchers / scientists. But that obviously comes from years of work experience.


> Because it is such an applied field, of course you want actual numbers

That's why you're an engineer and not a mathematician ;)

Because, no, _I_ do not want numbers if I can avoid them. Proving that something has one (and exactly one) solution is, for example, way more interesting than actually computing the result - that is so boring that even computers can do (some of) the work for us.


I doubt that the premise is true. Self-reporting is not a reliable indicator of actual performance in this case. It seems plausible that an empirical study would find mathematicians way above the average in elementary arithmetic, though possibly trumped by waiters and cashiers who often have to calculate without aid.

Of course, without an actual study this is just speculation.


Because arithmetic is completely solved and boring, duh.

There was a time when it was at least a little bit interesting, and many mathematicians were actually quite good at it.


There are clever techniques for doing multiplication: logarithm tables, prosthaphaeresis ("trigonometric logarithm tables"), quarter-square tables, abacuses. More recently (that is, within the last 60 years), faster-than-O(n^2) algorithms have been found as well, with a time complexity of O(n log n) having been achieved. These last algorithms range from being impractical to do by hand to being outright galactic.
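
Karatsuba's algorithm is the classic first step past schoolbook O(n^2): split each number in half and get away with three half-size multiplies instead of four. A toy sketch on Python integers:

    def karatsuba(x, y):
        if x < 10 or y < 10:  # base case: a single digit
            return x * y
        m = max(len(str(x)), len(str(y))) // 2
        xh, xl = divmod(x, 10 ** m)
        yh, yl = divmod(y, 10 ** m)
        a = karatsuba(xh, yh)
        b = karatsuba(xl, yl)
        c = karatsuba(xh + xl, yh + yl) - a - b  # both cross terms from one multiply
        return a * 10 ** (2 * m) + c * 10 ** m + b

    assert karatsuba(12345678, 87654321) == 12345678 * 87654321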


I was just about to post this. Arithmetic starts going to the moon once you're dealing with numbers larger than 64 bits: stuff like Karatsuba's algorithm, or all the Fourier-transform convolution hacks that libraries like mpdecimal use. Arithmetic is also very dominant in any sort of low-level programming. It's just that it isn't the same kind of arithmetic that math is used to dealing with, or even likes dealing with, since it's usually over a field that mixes boolean logic with arithmetic. It actually ended up being a security issue, because malware authors would find ways to obfuscate their programs using types of math that haven't traditionally been studied, so math tools would be completely powerless to make sense of them. So, definitely not a solved problem.


I didn't know about prosthaphaeresis! Thank you!

Richard Guy developed a single-scale nomogram based on elliptic curves in 01953: https://www.jstor.org/stable/3609499

His explanation is wonderfully simple; quoting the beginning: "Since the equation x³ + ax + b = 0 has zero for the sum of its roots, the x-coordinates of the three intersections of the line y = mx + c and the curve y = x³ + px + q add to zero." It may be entertaining to attempt to derive the rest of the nomogram from that sentence and the use of logarithms before consulting the (one-page) paper.
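
A quick numeric check of that opening fact (the line and curve values here are arbitrary): the intersections satisfy x³ + (p − m)x + (q − c) = 0, and with no x² term, Vieta says the roots sum to zero.

    import numpy as np

    m, c, p, q = 2.0, 1.0, -3.0, 0.5        # arbitrary y = mx + c and y = x^3 + px + q
    roots = np.roots([1, 0, p - m, q - c])  # x^3 + (p - m)x + (q - c) = 0
    print(roots.sum())                      # ~0, up to floating-point noise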

A nice advantage of Guy's contrivance over slide rules is its facility with squares and square roots. On a conventional Oughtred slide rule you can easily enough read off the square root of a number on the A or B scales by reading across the hairline to the D or C scales, respectively; but if your square had been computed on C or D, you are out of luck. Guy's nomogram has some similar limitations, but you can in general easily take the square root of any point on it.


> Because arithmetic is completely solved and boring, duh.

What is the fastest way to multiply two integers? Prove there is no faster way.

Do the same for division.

There. That will keep any mathematician busy for a while.


Mathematicians who care about the speed of doing a calculation are usually just called Computer Scientists.


I wouldn't read too much into it. This isn't a study that actually proves that they are bad at arithmetic; it's just a fun little anecdote.


Indeed. It's also a generalization that excludes many other variables.


Seems plausible. Similarly, a computer scientist may be bad at writing computer programs. (And this may also surprise someone outside the field.)


Contest-style rapid numeric calculation is a special case of "math". Many kids in math clubs focus on both — a significant part of contest math is word problems that have to be interpreted correctly and then calculated correctly, under time pressure — but calculation has little to do with higher math other than being able to validate symbolic results.

Fast calculation is about drilling and learning tricks. People can get pretty accurate and fast at simple arithmetic as long as numbers fit (or they've learned tricks to chunk them) into their working memories. If there's no inherent motivation or necessity to maintain those skills after school, they'll decay.

As the article points out, accountants and engineers are likely to be better human calculators. I'd guess accountants first, because engineers do more complex calculations where it's much more efficient to rely on computers. Accounting is focused almost entirely on arithmetic that you can get really good at without computers, and less on developing complex formulas where it's simpler to write them out on a computer and then let the computer plug in values and solve. For simpler math, inputting and transcribing numbers from a calculator or computer could end up being the limiting factor.

The other aspect of being good at arithmetic is not being particularly good or fast at doing the exact calculation, but rather at estimating what the results should be, so that if the computer gives them the wrong answer, they know almost immediately that they typed something wrong.


I am waiting for a follow up: "Why are accountants so bad at proving theorems?"


Why would anyone be good at arithmetic? Who does their own arithmetic?

I understand that before smartphones it was less common to have a calculator in your literal pocket all the time, but you'd have one at work. You'd have one for doing your taxes. Even your child would have one at the bottom of the toy box or left on the floor somewhere.

That was the state of things for decades.

I recently taught my daughter long division. Home schooling; it’s a requirement. I have to be honest, so I can’t tell her it’s important. She wants to know why it’s in the requirements. My best guess is that it was a job skill in the 1970s and that the education system has a lot of bureaucratic inertia.


I did long division on paper yesterday, to convert swimming-pool evaporation rates given as millimeters per day to nanometers per second. Probably a skill I use every week or two. It's true that I could do it faster with a calculator.
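
The conversion itself is one line of arithmetic: 1 mm/day = 10^6 nm / 86,400 s ≈ 11.57 nm/s (the rate below is made up):

    rate_mm_per_day = 4.0                 # hypothetical evaporation rate
    print(rate_mm_per_day * 1e6 / 86400)  # ~46.3 nm/s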

Perhaps a more practical example was that the day before yesterday my father told me that the recent collapse of an ice shelf off Antarctica was likely to raise sea levels by 27 inches in the next three to five years by debuttressing [what turned out to be] the Thwaites Glacier, so possibly the seaside camping spot he was enjoying that day would be under a foot of water in a few years. I immediately protested that this could not be correct; he must have misunderstood the research, because it was the wrong order of magnitude for such a short time period.

And in fact it was the wrong order of magnitude, and the article he had read was deliberately sensationalized by juxtaposing sea-level rises that might occur over the next 80 years with events that will happen over the next three to five years. So his campsite will be fine.

Didn't need mental long division for that, though.


I’m terrible at arithmetic, but one very useful skill I have (probably left over from my slide rule days) is doing 1 1/2 digit calculations with exponential notation in my head. Like your evaporation problem.

These days, though, I have a tendency to just raise my wrist and ask Siri, “How many inches to the Sun?”


While you certainly can calculate everything easily on a calculator, I think there's a value to being able to do your own arithmetic easily, in that this transfers over to good estimation skills and having a sense for when you typed the wrong thing into the calculator. Not something I have a proof of, though.


My teachers used to call that intuition "number sense".


The simple reason is that what you think is math is actually not math at all.

When I started studying theoretical math, on my first day, I was told to completely forget what I had learned in school because "it is not math". And then when we talked with our professors about them giving lectures to other students (studying electrical engineering or physics, etc.) we would hear things like "Yeah, these guys think math is hard, but they don't actually study math".

In short -- adding or multiplying things, applying numbers to formulae -- isn't math.

Actual mathematicians do not need to calculate a lot of stuff in their heads (though estimating things is sometimes helpful) and don't need to remember formulae (although if you spend any time doing actual math you will learn a lot of formulae). And if they need to they are just as likely to pull out a calculator or Google as any other person.

And the funny thing is -- before I went to study math, I attended a trade school. There, besides a lot of useful topics like economics, accounting, law, touch-typing, etc., we had one semester of actual arithmetic, which was about calculating stuff in your head, fast. We would spend the ENTIRE time learning tricks to calculate things faster, and the exams consisted solely of sheets of paper with columns of things to calculate (multiply these three six-digit numbers, etc.) which we had to do in our heads.


Mathematics is about abstraction; it's literally the study of objects. Different foundations of math define the fundamental objects of the universe in different ways. You have set theory, where everything must be constructed from the empty set and the operations of union, intersection, complement, and inclusion/comprehension. https://en.wikipedia.org/wiki/Set-theoretic_definition_of_na...

You might say, how the hell do I get an ordered pair (x,y) from sets? Well, this was solved in a myriad of ways. https://en.m.wikipedia.org/wiki/Ordered_pair#Defining_the_or...
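
One of those ways, Kuratowski's, is short enough to sketch with frozensets (a toy, not how any real system stores pairs):

    def pair(x, y):
        # Kuratowski: (x, y) := {{x}, {x, y}}
        return frozenset([frozenset([x]), frozenset([x, y])])

    assert pair(1, 2) == pair(1, 2)
    assert pair(1, 2) != pair(2, 1)  # order is recoverable from the nesting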

You also have category theory where everything is a graph and you start with the global object: 1 -> N then build all numbers recursively. https://en.wikipedia.org/wiki/Natural_numbers_object

And then you keep on using these complex objects to build even more complex ones, so you have a consistent system. And you derive truths about the system; these get called proofs. Arithmetic is child's play - it has nothing to do with consequence or truth. Math is all about starting with agreeable rules and deriving profound consequences that are a kind of umbrella that those rules cover, and sprout from. Canonically, those starting rules get called axioms. And we like to choose ones that agree with our common perception of the world -- i.e. the Peano axioms for how objects physically work when you combine them, split them, try to group them into rows and columns (prime factorization), etc.

Numbers are one of the simplest and most fascinating abstractions because they take the idea of an object and unify everything under it -- so two cats and two dogs get typecast to two objects and two objects, and then you can add them and get 4 objects. Numbers are basically a reflection of the most primitive type cast possible in our universe of thought. As such, number theory is called the Queen of Mathematics, because it seems to relate much more closely to our actual physicality than other axiomatic systems.


This rings true to me. I'm a good software engineer, decent at "real math" and close to hopeless at arithmetic to a level that has had me wondering if I suffer from some version of dyscalculia.

This is very confusing to friends and family who assume I should be some kind of savant with numbers until I explain "oh no, in fact the first practical program I wrote as a kid was one to do all my second or third grade math homework for me."


The reason is that arithmetic is a muscle memory that atrophies without use, and no one outside of school benefits from doing arithmetic by hand.

The same is true for algebra and calculus. Mathematicians who teach keep their memories fresh, but no one can pass calculus exams without practising.


Forget arithmetic; I remember conversations with exceptionally capable academic mathematicians who in many cases couldn't count without making mistakes. The likely reason: every number triggers a symphony of mathematical results and complicated ideas in their minds, from both memory of association as well as logic itself, and by necessity their minds also conjure up new concepts every now and then. All this added mathematical machinery interferes with the actual counting process at hand.

In an extremely interdependent field like mathematics, maybe the more you know, the harder simple operations become, because our short term memory inevitably gets occupied and interrupted with all the more complicated stuff.


Sometimes a cigar is just a cigar. I'm a mathematician, and I love numbers. But believe me, many do not - and they will literally define themselves by it, in a considerably insistent way.


arithmetic is a subset of mathematics, but it's a part that gets in the way of understanding

you can't prove things abstractly using numbers for the most part, you can only test cases

in a very glib way:

arithmetic: 1,2,3,… i bet you i can count really high. Arithmetic is integers, and while those are cool, they are ordered

mathematics: what kind of objects behave this way, can i prove there are finitely many or infinitely many. The field of complex numbers is not ordered

very different

there hasn’t been a universal mathematician for a very long time and you must choose for your own sanity and productivity what you’re gonna spend your energy on

if you are going to calculate, then don't go into "modern" abstract mathematics

you have to choose what your priorities are going to be, and everything else has to take a backseat


For the same reason why my reply to the question: "Hey, can you fix my printer" has always been: "No, I cannot."

Just because some things are very loosely related doesn't mean the skills between the two are transferable.


You don't think you could troubleshoot an average printer problem?


No, I don't, because "an average printer problem" usually translates to

"I bought these ink/toner cartridges from one website, and the sledge they go into from another one, and I managed to wedge them into the printer somehow even though now the hatch no longer closes properly and it makes funny noises when I start it, but the light is slighly more orange than last time, so that's a good sign, but it cannot fetch paper from the tray now, could you have a look at it? Because you're so good with the tech stuff..."

Sorry, but my proficiency in software engineering doesn't exactly qualify me for handling things like hatches, hinges, tiny gears, levers, springs and plastic boxes that don't fit well together.


If the person asking me to help them couldn't figure it out, why would I have a better chance?


Because they presumably asked you because they think you can do it.


For the same reason a biologist doesn't know the names of all trees by sight.


It's funny, in grade school we used to have "speed tests" where you had to add, subtract, multiply numbers as quickly as possible. I used to always do terribly, easily one of the slowest in the class.

Then I remember we got into high school and started doing more abstract math. The kids who were amazing at doing math quickly seemed to struggle with this, but I did just fine.

I'm not a mathematician or anything, but this article at least validates my belief that doing arithmetic fast in your head and doing more analytical math involve two completely different parts of your brain.


Well, not mathematicians, but I was once a physics student. We had a professor who was usually brilliant. But we had a few classes involving arithmetic: dimensional analysis or practical examples. He would scorn students who took out their calculators and say that any decent physicist could work it out within a few percent margin. Then he would work it out on the blackboard, and every single time he would be off by orders of magnitude, and we all tried our best not to laugh.


> When I began to teach 3D vectors two years ago, I realized I first had to teach it to myself, because I’d never actually learned it. My college courses skipped straight to “n-dimensional vectors.”

This sums up why I disliked higher level math classes in college so much. We never actually did anything _useful_ with the math, we never actually applied it to real life, we always just rapidly advanced to the general case ("n-dimensions"), took a test, then moved onto the next topic.


That's just why some people like it. My favourite review on Amazon, of Rudin's Principles of Mathematical Analysis (https://www.amazon.com/gp/customer-reviews/R23MC2PCAJYHCB/re...):

> It is not possible to overstate how good this book is. I tried to give it uncountably many stars but they only have five. Five is an insult. I'm sorry Dr. Rudin....

> "The material is not motivated." Not motivated? Judas just stick a dagger in my heart. This material needs no motivation. Just do it. Faith will come. He's teaching you analysis. Not selling you a used car. By the time you are ready to read this book you should not need motivation from the author as to why you need to know analysis. You should just feel a burning in you chest that can only be quenched by arguments involving an arbitrary sequence {x_n} that converges to x in X.

>... if you're a student and find the book too hard? Try harder. That's the point. If you did not crave intellectual work why are you sitting in an analysis course? Dig in. It will make you a better person. Trust me.

> Or you could just change your major back to engineering. It's more money and the books always have lots of nice pictures.


Bourbaki themself wrote a review on Amazon?

{x_n}


I think it’s also that mathematicians rarely care about the answer, more about the process.

Figuring out the volume of a complex shape with size scale x? Yes please! Missed a factor of pi/2? Meh who cares?

Multiplying two numbers? Oh look one is twice the other one, so it’s a perfect square times 2. Got the squaring wrong? Meh who cares.

Then even if you do care, it’s hard to shake old habits.

I can’t imagine an accountant saying “ooh here’s an interesting method for multiplying times the tax rate of 13.75%. Got the counting wrong? Meh”.


I think this is a crucial message for young students who feel inferior to that human calculator classmate.


Thanks for sharing this. I feel seen :)


I do math art (https://gods.art) and Machine Learning, but I pull out my phone calculator to figure out tips, and my most hated class in school ever was 5th grade math (long division).


i learned the multiplication table up to 10 times 10 during my phd, while doing 100+ page long tensor/spinor calculations. at some point it looked more effective to learn 6x7 by heart than to use a calculator.


Mensa members - 25% can't do math, 10% can't do arithmetic. From a 'Mensa puzzle-a-day' book with answers in the back and percentages of Mensa members getting them right.


Arnold may have partially explained it in https://www.math.fsu.edu/~wxm/Arnold.htm


But is there a perception that they are? I mean, in my home country, the stereotypes about mathematicians usually involve them being out-of-touch with the material world, so to speak.


I believe it.

I remember from my student days, a couple of M&CS students were playing some game, and afterwards, the mathematician managed to fill an entire A4 sheet trying to add up the scores.



Stanislaw Ulam (from his book Adventures of a Mathematician): "There are three types of mathematicians. Those that can count and those that can not."


Does anyone else feel like we should be teaching programming in math classes and implementing proofs instead of writing out arithmetic?


You need mental energy for arithmetic. It's mostly gone by the time you are old enough to be called a mathematician.


why are (some) computer scientists so bad at software engineering?


What a great example of the Bulverism fallacy! Love it!


Isn't this a little bit like "Why are computer scientists so bad at fixing my router / printer / virus infected windows laptop?" :)


Simple: we all know that Computer science is no more about computers than astronomy is about telescopes.

(Something like that can be said about mathematics, I am sure.)


When I was in school back in the 90's and worked for the computing center supporting Unix workstations on campus, it was always funny to get called over to the CS department to fix a professor's workstation and discover once again that they had zero clue how to use their machine. These were the same professors I had in my CS classes, who were fantastic. They were 100% theory and not at all practical, and actually weren't the least bit ashamed to admit it.


One of my professors answered “How do you go about debugging your programs?” with “I don’t debug, I prove my code correct before typing it in.” Not very useful.


I've worked with people a bit like that.


Yeah, I think it is similar to how some software engineers are more into writing tools and libraries than creating a product that common people use


Because something broke inside, after the times in which they were requested to help with your "virus infected mouse device". (Sorry.)

Or those times "What do you mean do not open the virus? I tried to open it, I clicked on it a thousand times, nothing happened!". (Both are very real and lived, I am afraid.)


Oh, you computer scientists!



