I've discovered Calculus Made Easy recently and it's wonderful. The edition edited by Martin Gardner is particularly good, with amazing preliminary chapters [0].
I've spent many years "pretending to understand" calculus, but the things I remember gnawing at me, like limits and infinitesimals, are accompanied here by context and history, so you can finally put yourself into the conversation and see that the confusion comes from only ever getting a fraction of the story.
[a critique, not a request for help] I found it relied too much on faith: you can ignore this small quantity, but not this one. What's the threshold? Why?
Fractional and negative powers are assumed to work as a generalization of positive integer powers, without proof.
But, TBF, all maths education requires a lot of faith. e.g. the unique prime factorization theorem is assumed in high school, not proven.
If I recall correctly, the author explicitly states in the introduction that mathematicians will hate the book precisely because it skips over proofs and takes a pragmatic approach. If you're the kind of person who wants to understand the proofs before using them, you will need to supplement this book with other material.
He spends much time on "minute" quantities at the start, but the explanation doesn't really make sense. To me, it's a mental model I can't trust, like rickety stairs.
The book is explicitly calculus-as-a-bag-of-tricks; monkey-see, monkey-do. As he says:
What one fool can do, another can.
(Ancient Simian Proverb.)
Fair enough on proofs. BTW, in the free Gutenberg edition (maybe the MG edition differs): the "prologue" doesn't mention proofs, only that textbook writers make things difficult (there's no "introduction"; also no "preface", except to the 2nd ed):
> The fools who write the textbooks of advanced mathematics ... seem to desire to impress you with their tremendous cleverness by going about it in the most difficult way.
Yeah, this sounds right. I think MG contextualizes it a little bit more. I actually like the book for the exercises and MG's footnotes especially, but as I worked through it I had to consult other resources to make sure I wasn't missing anything. The Khan Academy Youtube channel was very helpful for this.
Yes, but if you think faith alone is not enough you can always drop back to the sacred texts of the proofs.
But it is weird not being shown the proofs for things like generalizations to other situations. I studied calculus while in engineering at college and we always got the proofs for the tricks we were using.
If you just want to use a tool at a basic level, you do not need to have how it works explained. If you want to master a tool and be able to use it in novel situations, you need to understand how the tool was created. Same with math proofs: without knowing why something is true and how it was discovered, how can you utilize its full power?
Personally, working through the proof until I have a flash of insight is how I develop intuition that allows me to know when and where the "trick" should be applied, which is often more broad than the narrow context in which it is presented through the materials. You may even develop some new tricks of your own.
You want to prove it to yourself at least once I think. Even if you only need the result, it’s nice to have at least once in your life a complete understanding of what you’re using.
I don’t remember many proofs that I did, but I’m happy to know the maths that I still remember really works.
While not a math whiz myself, I understand why you would want proofs, but that can easily be taken to an extreme. Should you be measuring the gravitational constant every time you need to use it as well?
Seems like a good path to learn high level abstractions. As you progress in understanding you dig deeper.
Maybe I'm even still too much a sprite—but when I first learned anything about computers the first program I wrote was in a high level, simpler language. I wasn't moving bits around with explicit knowledge of where they were going.
Then again, maybe it's not a fair comparison on my part?
To some extent you're right, the benefit of a higher-level abstraction is using it without knowing the details. For standard usage, on the "happy path", this is fine. But if you need to modify techniques, or debug them, it's kind of impossibly frustrating without actually knowing what you're doing!
BTW computers have much cleaner abstractions than mathematics. e.g. the JLS defines Java independently of hardware, and IEEE 754 does the same for floating-point arithmetic. There are specifications all over the place.
But my experience with mathematics is completely different - you have to understand the lower level to understand the next level.
In my personal journey, I started off with your perspective of just learning the higher levels that I directly needed. It was very difficult, but after heroic efforts, I made breakthroughs! After a while, I noticed these "breakthroughs" were mostly entirely to do with material from lower levels... So I went back to them. This happened again and again, going lower and lower. Now I'm basically re-doing high school maths.
Heheh. That’s not too far off from me. Though my progress has been delayed I was starting to circle back through many basics even taking high school courses to prep for a return to university. Hasn’t panned out so far but I’d still like to do it.
Your point about specifications makes sense. I think that was a point that made some aspects of maths harder for me— contextual differences in notation all over the place. Thanks for the input.
I can relate to that, because it's something that's bothered me about most texts. The preliminary chapter in the Gardner ed. addresses this to an extent, I believe.
On the topic of infinitesimals, Gardner addresses their existence/utility as historically controversial. By providing both sides of the argument, he made me better able to understand how both parties were correct, and it has given me better insight into why and when small numbers are ghosts and when they corporealise.
Unable to look inside to see the textbook content, I went to the Bartlett Publishing website where it’s listed along with alternative textbooks debunking evolutionary theory and fossil forensics. Sorry, I’ll pass.
You can read the two arxiv papers if you like. The book doesn’t really have any technical content that isn’t in those; just more elaboration and exercises pitched at students coming to the subject for the first time.
I certainly wouldn’t recommend this as a sole introductory textbook; it’s obviously produced on a shoestring budget (a LaTeX file with the default template sent over to a print shop) and is somewhat limited in many aspects compared to established calculus textbooks. But the pedagogical idea of focusing on differentials seems sound. I haven’t ever taught an introductory calculus course, but I think it seems entirely plausible that this approach would save some confusion for many students (this is something which could be tested empirically, if any math-ed researcher has the time and budget for a study).
It’s not clear why the guy’s ideas about evolutionary theory (which I know nothing about) have anything to do with his ideas about derivatives. From what I understand he’s a computer programmer and math teacher without extensive training in biology; I wouldn’t expect him to have any insight into evolutionary theory.
I'm reading that now! It was suggested on another HN thread a few months ago. It's really given me a new appreciation of calculus. I run a data science department but there are parts of calculus that never really made sense before.
Thank you for the links. I always did well with math classes in school, but my retention was always so so after passing the final. Will give this a try for a more lasting understanding of the subject!
I sort of dislike this kind of "philosophical" introduction to calculus. Maybe I don't have the spirit of an artist.
The best way to start with calculus is the one by Gilbert Strang, who explains everything on the first page of his book (and the rest of the book are "just" examples).
The first half of the first page shows a drawing of the speedometer and odometer of a car and it explains what they are, and that they are not independent but related in a special way. On the second half of the first page it says that differential calculus is the task of computing the speed from the distance, and integral calculus is the task of computing the distance from the speed. Then it says a lovely sentence "this is not an analogy, this is the real deal and we have already started with the subject, and this is actually all there is to it". Then in the rest of the page it explains in a couple of sentences how can you compute speeds from distances and vice-versa, and why you need a constant of integration, and so on. It also proves the fundamental theorem of calculus. The rest of the book consists in concrete examples and a few more constructions, up to Taylor series.
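Strang's opening pairing (speed from distance, distance from speed) can be sketched numerically in a few lines. This is my own toy example, not from the book: the "odometer" reads s(t) = t², differencing it recovers the speed, and summing the speeds recovers the distance.

```python
dt = 0.001
times = [i * dt for i in range(1001)]          # t from 0 to 1
distance = [t**2 for t in times]               # odometer readings

# "Differential calculus": speed ~ change in distance over change in time
speed = [(distance[i + 1] - distance[i]) / dt for i in range(len(times) - 1)]
print(round(speed[500], 2))   # about 1.0, i.e. 2t at t = 0.5

# "Integral calculus": distance ~ running sum of speed * dt
recovered = sum(v * dt for v in speed)
print(round(recovered, 3))    # about 1.0, the total distance travelled
```

The two halves are inverses of each other, which is exactly the fundamental theorem Strang proves on that first page.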
I can't imagine someone with an arts background getting anything at all out of this introduction. The Gilbert Strang approach is much clearer - although it's still not obvious why an artist would ever need calculus.
In fact there are some applications in generative design and 3D animation, but it's still low on the list of essential art skills. And if you really need those effects - and know enough math to understand how to code them - you can copy code from a cookbook without having to derive it from first principles.
I don’t believe that anybody can really learn any math (much less calculus) by just reading about it without doing a lot of practice problems. It’s nice of this guy to put this together and all, and I guess it could pique somebody’s curiosity, but they’ll really need to pick up a good book with a LOT of sample problems to understand it any better than they understood it before they read this introduction.
These types of tutorials are pretty popular among VFX artists who may suddenly need to brush up on complex math, but don’t work with it consistently. Knowing the fundamentals of calculus comes in very handy when suddenly you’re tasked with working in Houdini for six months and have never touched the software before.
Different people learn differently. Very few people learn all they know from a single source. Therefore, having many sources presenting a same subject in many different ways is a wonderful thing.
Gilbert Strang’s book is wonderful! Some will find it to be the “best book” to learn calculus, others won’t. No need to poo-poo others’ work just to virtue signal. (I’m sure the authors would appreciate any constructive feedback though!)
But I'm not "poo-pooing" Kleitman's book! It's just that after reading the beginning I did not find it very engaging personally. It starts with numbers, and properties of real and complex numbers. The definite integral is not defined until halfway through the book (while it is a simpler concept than the derivative). I prefer to start with calculus, and the relationship between integrals and derivatives, right away. And then, much later, when you need some crucial property of real numbers, introduce it as required. You can do a great deal of calculus without needing to know the definition of a real number!
I hesitate to agree with you because this (website) is a nice interactive presentation.
But it depends on the audience and, even more, on the depth of result required.
If I was to introduce calculus to non-mathematicians, I'd use almost no maths at all. Not even symbols! I'd use area and slope, with specific examples - probably just distance, velocity and acceleration.
I think you'd be able to reveal the magic relationship between slope and area with just that.
Yeah, I love the one that starts "All of linear algebra deals with the four following problems: Ax=b (where A either is square, tall rectangular or wide rectangular) and Ax=lx where A is square."
This is the exact example that made calculus “click” for me. Before that math was a chore and after this example it all seemed so much more useful. I wish I had seen it in junior high instead of college.
I found the best way to truly understand and enjoy calculus was to learn it as it was historically developed https://youtu.be/HRD9X-2Bmdw and then to learn about the problems they ran into and how we ended up with modern calculus after Euler and Lagrange tried to correct those problems https://youtu.be/fCZ8jJCVinU (these lectures cover Stillwell's book Mathematics and its History)
An interesting bit in the second lecture is how Australia does 'photo radar' on one stretch of highway: it records you going through a gate at one point, then many kilometers away records you again at another gate, then establishes your average velocity between the two gates using calculus and sends you a ticket if your calculated average shows you must have exceeded the speed limit at some point between them.
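For what it's worth, the camera's arithmetic is just average speed = distance / time; the calculus is the mean value theorem, which guarantees the car's instantaneous speed actually equaled that average at some moment. A toy sketch with invented numbers (not real enforcement logic):

```python
# Point-to-point camera logic, illustrative numbers only:
distance_km = 20.0
elapsed_hours = 0.15          # 9 minutes between the two gates
limit_kmh = 110.0

average_kmh = distance_km / elapsed_hours
print(round(average_kmh, 1))  # 133.3

# Mean value theorem: at some instant between the gates the car's
# instantaneous speed equaled the average, so it must have exceeded the limit.
ticket = average_kmh > limit_kmh
print(ticket)                 # True
```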
> best way to truly understand and enjoy calculus was to learn it as it was historically developed
In fact, I'd argue it's the best way to understand almost anything, especially in math. Many topics I found somewhat confusing at school or university got really simple once I learned about their history. Little by little I've come to feel that most great inventions and discoveries by people we regard as geniuses are brilliant in how clear, beautiful and somewhat unexpected the solution is; yet the solution is rarely complicated, and usually seems like the most natural thing in the world once you're told how Fourier/Laplace/Leibniz/etc. discovered it, rather than having it hidden behind the standard school math curriculum.
That's part of why I love 3Blue1Brown videos so much, and why I love Morris Kline's books. And it always makes me kind of sad to think how much time I wasted trying to come to terms with what was just an unnatural explanation.
Morris Kline's book "Calculus: An Intuitive and Practical Approach" is still in print. I prefer the paper version because the Kindle version has many formatting problems.
I swear one day I'm going to write a WebExtension whose sole purpose is to auto pin comments about 3B1B's videos on calculus and linear algebra HN threads.
Seems like a handy resource, but the navigation is a bit annoying to deal with. As a quick hack with `wget --mirror` + `pandoc` to join the documents, I got the following single-page layout: https://cloudflare-ipfs.com/ipfs/QmZetPzAP5bC3uXVmNUKU2DU22L...
(I take no responsibility for any formatting errors. Math seems to render correctly, but probably some links are broken)
The tone from the beginning (at least in Chapter 0) struck me as what I might class "fey wankery". The wry style gets in the way of communication. Is there some misapprehension amongst STEM explainers that people who are less familiar with maths need to be treated like skittish, possibly mentally-challenged deer?
Or is it a nod to Socratic dialogue except one of two is a moron?
Then Chapter One leaps into rational numbers, set theory and fractions and commits the usual errors: asymptotic curve into complexity, no history, no vivid metaphorical visualisations and (a personal peeve) no historical or causal explanations.
Chapter One literally starts with a question ("What are numbers?") and does not ever answer it. You can count them, some of them are natural (what does that mean?), you can perform operations on them. But what are they, Professor Kleitman? An abstract concept that can be derived from our ability to discern a first order similarity between both similar and dissimilar objects? Is it too vast a question for an introduction to calculus? Then don't use the question in your section header.
Operations? "There are addition, subtraction, multiplication and division." Why? Why isn't there redition? What's redition? I don't know, it's an operation I made up, but you seem to have plucked four relations between natural numbers from the air and asked me to assume that's acceptable.
The text continues in similar, tedious terms, walking us through the basics on stepping stones of assumption and unearned trust.
Educating beginners and "artists" doesn't require you to speak to us like five year olds. Take a leaf from Feynman or Fuller's books and talk to us like adults but do the work in creating vibrant metaphor.
Just my opinion and apologies to Professor Kleitman.
To end on a positive note, the opening pages of Gilbert Strang's book, linked by another poster here, were much more effective in conceptualising the need for and use of calculus by anchoring it in a strong metaphorical example.
In barely a page or two, I understand a relationship between velocity and distance - and that time is involved - and how that relationship can be geometrically envisioned.
Further, I'm already pondering how one might deal with a more realistic car journey with a variable velocity, something that calculus will address later on.
The difference? No infantilisation; instead a clear and applicable metaphorical example.
This question "What are numbers?" I find way more fruitful to contemplate than any answer I have so far found. And why should we be adults anyway? Adults already know all the answers. Ask this question like a five year old would.
We can determine the linear function which takes value f(a) at a and f(b) at b by the following formula:
f(x) = f(a) * (x-b)/(a-b) + f(b) * (x-a)/(b-a)
The first term is 0 when x is b and is f(a) when x is a, while the second term is 0 when x is a and is f(b) when x is b. The sum of the two is therefore f(a) when x is a and f(b) when x is b. And it is a linear function. Linear functions have a term that is x multiplied by some constant, and may also have a constant term as well.
Umm... whut?
This has me completely lost. I don't see how this can be geared towards beginners and artists.
This can be used to linearly blend any two things, as for t in between 0 and 1, we'll get part of f(a) and part of f(b).
Now, if we want to use x instead of t, where x goes from a to b, we need to convert:
x = a <=> t = 0
x = b <=> t = 1
The distance that x has travelled from a is (x-a). The total distance from a to b is (b-a). t is the proportion of that total distance, so t is the distance travelled over the total distance:
t = (x-a)/(b-a)
Now (renaming g to f as we're changing parameterisation* )
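Putting those two pieces together, here's a minimal sketch in code (my own, using the f, a, b of the quoted formula): blending f(a) and f(b) by the proportion t = (x-a)/(b-a) gives exactly the book's two-term expression.

```python
def lerp(fa, fb, t):
    """Blend two values: t=0 gives fa, t=1 gives fb."""
    return fa * (1 - t) + fb * t

def line_through(f, a, b):
    """The linear function agreeing with f at x=a and x=b."""
    def line(x):
        t = (x - a) / (b - a)      # proportion of the way from a to b
        return lerp(f(a), f(b), t)
    return line

f = lambda x: x**2
g = line_through(f, 1.0, 3.0)
print(g(1.0), g(3.0))   # 1.0 9.0 -- matches f at both endpoints
print(g(2.0))           # 5.0 -- the chord's value, vs f(2.0) == 4.0
```

Expanding lerp, fa*(1-t) + fb*t becomes f(a)*(b-x)/(b-a) + f(b)*(x-a)/(b-a), which is the quoted formula with the first fraction's signs flipped top and bottom.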
> In mathematics, the term linear function refers to two distinct but related notions:
> - In calculus and related areas, a linear function is [...] a polynomial function of degree one or zero.
> - In linear algebra, mathematical analysis, and functional analysis, a linear function is a linear map.
I personally prefer the terminology "linear function" for the former definition (polynomials of degree 1 or 0), and "linear transform" for the latter definition (a linear map).
I'm trying to understand why there's f(a) and f(b), why is b being subtracted from x, why it's divided by a-b, why there's a similar thing for f(a), why the two are being added, why the zero values are important, and what the whole thing actually means.
It's demonstrating how to find the line that passes through two points x=a, x=b on the function y=f(x).
Without having read the actual page, I assume he's going to move a and b close to each other to approximate the tangent of the function at a point.
edit: In higher level math, you use circles/spheres or parabolas/paraboloids to approximate functions, but in high school level calculus you stick to using a straight line to approximate a function
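That sliding-secant idea is easy to see numerically; a quick sketch (my example, not from the page):

```python
def secant_slope(f, a, b):
    """Slope of the line through (a, f(a)) and (b, f(b))."""
    return (f(b) - f(a)) / (b - a)

f = lambda x: x**2       # derivative at a=1 is 2
a = 1.0
for h in (1.0, 0.1, 0.01, 0.001):
    # slide b = a + h toward a; the slope approaches the derivative
    print(round(secant_slope(f, a, a + h), 4))
# prints 3.0, 2.1, 2.01, 2.001 -- approaching 2
```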
Unfortunately, he doesn't do that. There's an even more complicated formula under that which is supposed to be related to the slope of some line and a ratio between b and a (I think?) and then I'm guessing he reduces it, but doesn't explain how he did it.
I've tried learning calculus 3-4 times during my life, using materials for an "absolute beginner", and this has always been my experience, as if one were teaching programming by going from "This is a variable. It can store data." directly to "A monad is a monoid in the category of endofunctors." To this day I have no idea what calculus even is, or what it's for.
From reading a few excerpts only, I am reminded of a lot of popular science writing, which too often adopts a condescending casual tone full of flimsy, inaccurate metaphors and chatty asides, and which too often omits simple statements of the details of the subject at hand (often less complex than the verbiage used for the adulterated version).
The writing becomes so far removed from the subject, that while those already familiar with it will be able to guess what the author was getting at, those who are not will remain in the dark, leading to a zero increase in the reader's knowledge in each case. Despite being in the former category with respect to basic calculus, I can't help but feel a sense of umbrage on behalf of the latter.
Poking around this, it looks like there's a well-written introduction to calculus buried inside, but the navigation is terrible. They should come up with a better way to present this. A single long scrollable page, as clunky as that might be, would be far superior to this.
A point as minor as they come, but I really am not a fan of that embedded chapter structure. It's absolutely ok to have multiple sub-topics in a topic (that's why the TOC exists in the first place!), but a text is so much more enjoyable to read when you can simply go through it linearly, without worrying about what link on what level you clicked just before.
Despite the title, I struggle to see who exactly this was written for.
The Javascript widgets are quite neat. Fiddling around with parameters, and receiving instant graphical feedback, is great for developing an intuitive understanding. But I find that the writing has several issues that probably make it less clear to beginners than your average calculus textbook:
The writing is very brief, with typical mathematical terseness. This is usually a good thing, since it lets you be very exact in your definitions. But to be accessible to a beginner, this writing needs to be backed up with concrete examples, and preferably a picture or two. In most cases I find that it isn't, so the reader needs to internalize a lot of things on their own before being able to move on.
The text is also sparse on showing its work. A positive example is near the bottom of 5.1, showing how to derive the quotient rule, but most of the time it looks more like example 1 in 6.1, with a lot of information given inline before actually applying the mentioned operations to the expression:
> Suppose we substitute the function g which has values given by g(x) = x² + 1 into the function f which takes values f(x) = x³ − 3.
> The substituted function f(g) has values f(g(x)) = (x² + 1)³ − 3.
> Let us compute the derivative of this function. The derivative of f(s) with respect to s is 3s², while the derivative of g(x) with respect to x is 2x.
> If we set s = g(x) which is x² + 1, and take the product of these two we get:
> [expression]
This type of mental expression manipulation is fine for someone that's had practice with it, but probably not for a beginner, who would gain a lot from having these kinds of things written out in a more structured way.
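For anyone following along, the quoted computation can be written out and sanity-checked against a finite difference in a few lines (a sketch of the book's example, not its code):

```python
g  = lambda x: x**2 + 1          # inner function
f  = lambda s: s**3 - 3          # outer function
fg = lambda x: f(g(x))           # f(g(x)) = (x**2 + 1)**3 - 3

df = lambda s: 3 * s**2          # derivative of f with respect to s
dg = lambda x: 2 * x             # derivative of g with respect to x

x = 2.0
chain = df(g(x)) * dg(x)         # chain rule: 3*(x**2 + 1)**2 * 2x
print(chain)                     # 300.0 at x = 2

# Central-difference check that the chain-rule answer is right:
h = 1e-6
numeric = (fg(x + h) - fg(x - h)) / (2 * h)
print(round(numeric, 2))         # 300.0, agreeing with the chain rule
```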
In these aspects, I find the text less clear than the calculus textbook I used at university (which was not directed toward beginners or artists).
I agree a lot with prvc's comment[0] about being able to fill in the gaps, in that the terseness and handwaviness can make this look like a beginner-friendly version to someone that already is familiar with the subject, but I don't think changing tone is enough to make something beginner-friendly.
Came across this just when I was looking to deep dive into calculus, typed lambda calculus, differentiation, etc. I somehow have a hunch learning these, will help me down my road on language design and implementing compilers.
Only midway, but hands down this is the best material I've come across on the subject. As light-hearted as it is detailed.
Update: Thank you, all. I guess the dots would never connect down my road. But I'm midway and the topic's still interesting enough for me to keep going.
Integral and differential calculus are not useful here at all [1]. Compilers are not very computation-heavy, and where they do require grinding through math, the interesting problems are discrete math rather than continuous-variable problems, which means that integral and differential calculus very rarely offer insights.
The math you usually want is algebra. Even then, the elementary algebra you learned in high school (or earlier) is usually sufficient. More complex domains in math are usually used for theoretical analysis and are less common in practice. Many dataflow optimizations can be described as monotonic functions over a semilattice, but they are rarely implemented as such. Type systems in functional languages can be understood more richly as applied category theory. Linear algebra is useful when multidimensional loop nests come into play.
[1] Unless maybe you're building a neural network compiler.
Just out of curiosity, why do you need to know differential calculus to write a compiler, and what does it have to do with lambda calculus? Learning it is a great thing to do, and it will certainly help you in other ways. But that kind of programming usually has little in common with the "mathematical analysis" kind of calculus (unless you're specifically interested in differentiable programming, which is actually kind of big these days). Anyway, happy studies!
Oh, I have this bad habit of going down the rabbit hole of fundamentals if I feel I lack the basics of a concept. While I still haven't grasped the need for calculus in language design, I've come across pieces on how Lisp is based on lambda calculus vs Haskell on typed lambda calculus. Also, that any problem that can be solved with a Turing machine can be solved by lambda calculus as well. While not much about compilers, I guess it will solidify my understanding of language design?
I'm not sure. I'll rather learn and hope the dots will connect later than ignore the subject altogether.
I guess... Well, maybe I'm misunderstanding what you're saying, but I just want to be clear that you know that the branch of math most people call "calculus" is almost completely unrelated to the branches of math called "lambda calculus", and "typed lambda calculus." The former, which this link is about, has very little to do with language design. In fact, there are all different kinds of math called "X calculus" and "calculus of Y", each of which has nothing to do with the other.
Just "calculus" is short for "differential and integral calculus," which is about the local behavior of functions and the measure of objects in continuous spaces. Many people study this in high school (and earlier), since it's the underlying mathematical technology for much of the physical sciences. "Lambda calculus" is a logical system for manipulating symbols representing functions, and is indeed equivalent to Turing's symbol-manipulation system. So they both have to do with functions, but that's about where the similarity ends!
I hope I'm not coming off as brusque or anything, I just don't want you to spend loads of time on something that isn't what you think it is! That said, lots of programmers are surprisingly weak in (differential/integral) calculus, and could use a good dose of it anyway :)
Just to clarify, you do realise that lambda calculus is an entirely different field to the calculus of mathematical analysis? They really have very little in common apart from the name.
Sorry if you already knew this, but it seemed like you were making a connection that wasn't there.
Let's clarify here: "Calculus"[0] and the lambda calculus have almost nothing to do with one another. They just share a word in their name, like the Department of Mathematics and the Fire Department. There are several different things called "the X calculus".
[0]:Could be called "numerical calculus" or "calculus of integration and differentiation", perhaps.
A while back I was trying to understand why some things are called algebras and others calculi. The answer I got is that the term calculus relates to the introduction/reduction of variables or symbols, whereas an algebra is the manipulation of existing symbols.
Calculus means "stone" in Latin. I suspect the relationship between stones and counting is why we use it so much (though I'm not an expert on how/why Romans used stones).
Good clarification, but I wouldn't say almost nothing. All are formal symbolic reasoning techniques.
For example, the predicate calculus is invaluable for writing correct programs with mind-bogglingly large input domains, such as the set of all C++ programs.
Neither lambda calculus nor regular calculus is going to help you write a compiler. And certainly regular calculus isn’t going to help you design a language...
If you're writing a compiler for a functional language, lambda calculus is quite useful, and even necessary in some cases (understanding typed lambda calculus is pretty fundamental to statically typed functional languages from an implementation point of view).
This is an excellent book, which is full of historical insights, references to people I didn't know about (Sophie Germain, Mary Cartwright, Sofia Kovalevskaya), and emphasizes intuition and applications rather than proofs.
I took 2 semesters of Calculus in high school. It was a "requirement" to get college AP. I was convinced that I would have failed at college if I didn't take it. I have yet to use it in my career. A waste of time. A 2 semester class of drawing or Art would have been way more useful.
Why is calculus held in such high esteem in college prep?
Most students can safely skip calculus. A 1-day general introduction would be enough for 90%+ of college students.
90% of college students aren't STEM majors, so what you're saying is almost a tautology. For STEM there is literally no branch that doesn't use calculus in at least some way.
As someone with extreme dyslexia and dyscalculia, this is fascinating. Because numbers and words are images for me, my recall and then contextual association of them is very slow, building different frameworks to understand them is super important and I wish there was a greater emphasis on different memory styles in learning outside of rote.
This site undermines its own intent of providing a friendly and accessible introduction to calculus through poor and erroneous grammar. An editor could make it very useful. Some examples:
“OK, but how does calculus models change?” error: subject-verb number agreement. Try “model”.
“The fundamental idea of calculus is to study change by studying ‘instantaneous’ change, by which we mean changes over tiny intervals of time.”
So, a change is a changes? Stay with the singular: “by which we mean a change over a tiny interval of time.”
“It turns out that such [tiny] changes tend to be lots simpler than changes over finite intervals of time.”
Now we have a logical problem. Isn’t the set of tiny intervals of time a subset of finite intervals of time? In ordinary usage, “tiny” is finite, surely; a tiny thing is not infinite. So how can a tiny-time change be simpler than a finite-time change, when it is itself a finite-time change?
Errors of this sort lost my faith early on. I hope they’re corrected, because the promise of the article is appealing.
In my opinion, your comment is very self-serving and reminds me of a quote from Theodore Roosevelt:
“It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.”
If you really cared about the subject you could have sent the creator/department a message with your constructive criticism. Instead, you posted on the forum about how a few errors invalidate an entire work that took a significant amount of time.
Khan Academy is a better resource for being introduced to calculus.
The title of the course unfortunately perpetuates the mistaken stereotype that artists are not mathematically inclined. The incoming class my freshman year at the art school of the university I attended had the highest median Math SAT score of any school (i.e. engineering, science) in the university.
For linear algebra, the YouTube channel 3blue1brown has a truly excellent series; I found it to be the best intro to the subject available.
This comment is like a red rag to a bull for me, because why wouldn't some artists want a basic grasp of calculus?
Bear in mind that a secondary education, for most people, involves learning the basics of calculus. You might be thinking that all artists are bad at math or too stupid to grasp calculus. You might be thinking that there are no applications of calculus in art. I don't know. But given the role artists are supposed to play in society, it seems pretty reasonable to expect that some of them will have an interest in learning calculus to a high school level like a significant fraction of the non-specialist public.
A concrete example of this might be an artist who is interested in systems theory from an ecological perspective. Once you start talking about stocks and flows (of fish or minerals or greenhouse gases) then calculus and differential equations are really the next thing to tackle.
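To make that concrete, a stock-and-flow model is just a differential equation stepped forward in time. Here's a toy fish-stock sketch with invented numbers (logistic growth minus a constant harvest, integrated with Euler's method; none of this comes from the thread):

```python
# dS/dt = r*S*(1 - S/K) - harvest  (logistic growth, constant catch)
r, K, harvest = 0.5, 1000.0, 50.0   # growth rate, carrying capacity, catch/year
stock = 400.0                        # starting fish stock
dt = 0.01                            # time step in years

for _ in range(int(50 / dt)):        # simulate 50 years
    flow = r * stock * (1 - stock / K) - harvest
    stock += flow * dt               # Euler step: stock accumulates its flow

print(round(stock, 1))  # about 887.3, the stable equilibrium of the flow equation
```

The stock settles where the net flow is zero, which is exactly the kind of question that turns systems-thinking talk into calculus.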
I interpreted the vague "Artists?" very differently from you. I thought it was saying "What about this makes it appropriate for artists? It is not showing off the beauty and simplicity of Calculus."
> You might be thinking that all artists are bad at math or too stupid to grasp calculus
I think labeling it as "for Beginners and Artists" is actually doing some of that. I certainly would not label a book as for artists unless I thought it would be particularly good for artists, and from what I have seen this book does not touch on creativity or beauty. So it must be labeled "for Artists" on the assumption that they can't learn calculus from existing books, which might imply that "all artists are bad at math or too stupid to grasp calculus".
You can read the full text for free here [1]
[0] https://openlibrary.org/books/OL351037M/Calculus_made_easy
[1] http://calculusmadeeasy.org/