We really need to quit forcing programmers to take calculus. Even set theory is used by only a minority of us. Nobody actually needs order of magnitude either, so long as you can tell one thing is bigger than another, which is about one day of learning after you cover polynomials.
The vast majority of us never use more than basic algebra. Counting, multiplying and dividing money, and estimating effort in story points (and maybe dollars as you get seniority) are the big math items.
I'm horrible at calculus, a fault that cost me around $30,000 in makeup classes and a delayed graduation as I kept losing scholarships.
Yet I have never, and will never, use Calculus in my field. I was told this by the career coach in my intro class, who said "almost none of you will ever use Calculus"; yet my school still made everyone take up to Calculus 4. Who the hell is finding the area under three-dimensional curves without relying on a library? Madness.
I'm inherently biased because I studied applied mathematics, but if you advocate removing mid-level mathematics (calculus, set theory, much of college algebra) from a computer science degree, what you have left is a few years of courses about 'scripting' and the practical use of particular specialised tools. One cannot aim to become some kind of 'engineer' or technical person without at least a grasp of these mathematical techniques, even if they're only learnt for the sake of understanding and then forever set aside. They enrich your thinking; without enrichment your mind remains... impoverished.
Beyond that, computer science is an offshoot of mathematics (Turing laboured on computability, Boole on the algebra of logic, Von Neumann on finite difference methods). To rid computer science of mathematics makes about as much sense as stripping psychology out of counselling.
You have a much too 'practical' view. What you have in mind is not a graduate degree in computer science but some kind of... vocational training for programmers. Which is OK, if that's what you want, and provided you leave the faculty of computer science intact for those who wish to study the abstract field and make progress therein.
I agree. Too many people conflate a computer science degree with the ability to code. The real purpose of a computer science degree is to learn to model, analyze, and solve problems in a computational framework. Mathematics is the language used to do that, so math is a necessary part of such an education.
When it comes to sitting down and writing software, such a mathematical computer science education provides a clear benefit. The more math you know, the more problems you can formulate and solve with a computer. If the job is cranking out web pages, a mathematical background will increase efficiency because you'll recognize that the problems at hand fit into familiar formalisms with familiar solutions. Moreover, if you run into a hard problem, a mathematical background will provide the necessary tools to work through it in a principled manner.
I don't think you can properly learn probabilities without understanding a degree of calculus. And I don't think you can have a complete scientific education without a measure of probabilities and statistics.
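To make that concrete with a standard example: for a continuous random variable, a probability literally is an integral of a density. With an exponential distribution of rate λ, P(X ≤ t) = ∫₀ᵗ λe^(−λx) dx = 1 − e^(−λt). Without integration you can't even say what a density function means, let alone work with expectations or tail bounds.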
On the other hand, I think you don't need computer science to be a programmer, and a lot of us have suffered and will suffer from the conflation of these two fields. Many "computer science" curricula are actually programming. Many people who want to learn programming enroll in computer science because the difference is not clear at all. This was the case for me, and I was lucky to actually like CS, but it was not what I thought I had signed up for.
> I don't think you can properly learn probabilities without understanding a degree of calculus. And I don't think you can have a complete scientific education without a measure of probabilities and statistics.
I don't think the parent said zero calc or statistics (rereading, the parent did say no calc, which I disagree with). I think the complaint stems from the fact that many CS curricula started, and often still exist, in the math department. Should a CS student take calc classes up to level 4, or stop at level 2 and instead have more advanced courses around the computer part of the science?
Just because you don't understand it doesn't mean it isn't useful. For a trivial example, I often use calculus to measure rate of change of telemetry of production systems, and that's just as a simple developer measuring performance of things I'm responsible for.
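For instance, a minimal sketch of that kind of measurement (the sample data and names are made up, not my actual production setup):

```python
# Finite-difference rate of change of a monotonically increasing counter,
# the discrete analogue of a derivative. Data points are hypothetical.

samples = [  # (timestamp in seconds, total requests served)
    (0, 1000), (15, 1450), (30, 2100), (45, 2400),
]

def rates(points):
    """Approximate d(count)/dt between consecutive samples."""
    return [
        (t1, (c1 - c0) / (t1 - t0))
        for (t0, c0), (t1, c1) in zip(points, points[1:])
    ]

for t, r in rates(samples):
    print(f"t={t:>3}s  ~{r:.1f} requests/sec")
```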
I assume by "order of magnitude" you're talking about big-O and friends... I use that math all the time when I deal with things at scale. If you're doing anything that's not a CRUD app with a trivial number of users, you quickly run into math.
This is one of the reasons that my undergrad degree is in 'Computer Information Systems' instead of 'Computer Science'. The number of CS classes between the two degrees was nearly identical, but the high-end math (anything beyond calc 2, iirc) was replaced with business classes (accounting, economics, finance, etc.) and statistics.
At the time I was in school there was a real worry of outsourcing all programming, so I didn't want to be 'just a programmer.' I hoped that, in the worst case, if all the jobs were outsourced, I could be in charge of a remote team. Turns out the outsourcing didn't happen quite to that level (although at one point I was in charge of a team on the other side of the world), but the business courses have been very valuable over my entire career so far.
I'm seeing this type of split more often in CS, and I think it's great. CS + business, CS + art, CS + media, CS + whatever is, I think, the future. Programming alone is great (someone has to write the libraries we all use ;) ), but take a competent programmer, add another skill they are passionate about, and you have a real force.
I did CIS too, so I could avoid math, but now I find myself regretting it. I always enjoyed math, but my middle school teachers were more concerned with yearbook planning than teaching, and I always suffered for that.
I’ve actually recently started exploring Linear Algebra on my own. So seeing this list is helpful, as I’d like to explore Calculus too.
I agree with your point, though. My courses in Economics and Accounting have proven valuable to my career so far.
The need to take calculus and other math-related classes in an engineering or science field is not about solving integrals or differential equations in your head at a normal 9-to-5 job. It's about something extremely important in computer science and engineering in general: abstraction; the ability, when faced with a [new] problem, to hide the details away for a moment and concentrate on the important parts, recognize patterns, know which tools to use, etc. Then you can bring the details back when you need them, to form a solution.
You'd certainly go crazy if you started thinking about all the little details as soon as you faced a problem, and you'd probably never deliver a solution in the time needed.
It turns out that mathematics is very good at gradually giving you new levels of abstraction to solve new problems, or old problems in new ways. The important thing is not to memorize formulas (that's what books and Wikipedia are for), but to know which one to use, and when.
> We really need to quit forcing programmers to take calculus.
I used Calculus one day to solve a random business problem.
I noticed the pattern was a Taylor Series.
It had been over 10 years since I learned it, but I immediately recognized the pattern.
I solved it via brute force on Excel. It took over 50 pages of Excel calculations to reach the answer. (About 2500 lines of calculations, which is easy to do in Excel.)
And once I noticed the pattern, I was able to turn it into a formula.
And now I had a one-line formula that I was able to use to solve my business problem. Then I plugged this formula into Excel too, and now I had a little Excel app that could get the answer for me.
Did this make me any money? No. Did this directly make the business any money? No. But it did help the business: they now have a mathematical formula they can use to answer that specific question if another customer asks for it in the future, specifically during the initial sales process.
And all this, because I had been forced to study Calculus in college.
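The general shape of it, with the exponential series standing in as an example (the actual series from the business problem isn't reproduced here):

```python
import math

# Brute-force partial sums of a Taylor series (the '2500 rows of Excel'
# approach) vs. the closed-form one-liner you get once you spot the pattern.
# The exponential series e^x = sum of x^n / n! is just a stand-in example.

def brute_force_exp(x, terms=2500):
    total, term = 0.0, 1.0  # term starts at x^0 / 0! = 1
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # next term: x^(n+1) / (n+1)!
    return total

x = 3.7
print(brute_force_exp(x))  # ~40.447
print(math.exp(x))         # the closed-form 'one-liner' agrees
```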
Funny, because in my field (crypto) I encounter calculus directly fairly regularly -- despite the fact that crypto looks like it is all about discrete math. I also see calculus being used in other fields that I have had to interact with; AI immediately comes to mind (stochastic gradient descent), and various fields related to signal processing (including computer vision).
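For a toy illustration of the gradient descent connection (the objective, step size, and iteration count are all made up; real SGD just adds random minibatches on top of this):

```python
# Minimizing the toy objective f(w) = (w - 3)^2 by following its derivative
# f'(w) = 2(w - 3) downhill. This derivative is the calculus at the core
# of gradient descent, and by extension SGD.

def f_prime(w):
    return 2 * (w - 3)

w, lr = 0.0, 0.1  # arbitrary starting point and learning rate
for _ in range(50):
    w -= lr * f_prime(w)  # step against the gradient

print(w)  # approaches the minimum at w = 3
```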
Really though, the immediate applicability is beside the point. We are not talking about learning to program, we are talking about learning computer science. There is value in having a breadth of understanding in basic math topics (yes, calculus is a basic topic; if you want something advanced you need to look at real analysis).
"Programmers", sure. But for a professional, degreed practitioners let's have some proper education please. What the heck is so hard about a bit of 17th C. mathematics anyway? Or are we talking about more advanced topics like "Analysis" (19th C.)? Either way, I don't want a dumbed down low wage profession. I'm looking for parity with at least doctors, accountants and high school science teachers..
By the way, I am wondering if you had some bad teachers who themselves failed to fully understand calculus. Good teachers make all the difference.
I totally agree. We are slowly diluting our profession by reducing it to a few online courses. The vast majority of college is not about learning a specific topic. It's about being given a wide range of problems to solve in very different and complex topics that stretch your abilities.
I've been amazed at how many programmers I've interacted with in the past two decades who cannot operate outside the one small subject they currently know. It's one thing to say you can't work on scientific modeling because you couldn't do calc, but when I see front-end developers who say "nope, I can't do back end, ever," I realize that they were never challenged, and they will be the last person I want on my team. I want an employee who says "I don't know that topic, but I'm going to learn it and solve this problem!" Dumbing down our profession doesn't help us.
> We really need to quit forcing programmers to take calculus. ...
This is a resource to learn Computer Science - not computer programming. Calculus and many other advanced mathematics topics are very much needed if one wishes to learn Computer Science.
To the extent that this is true, I doubt that it will create many additional programming jobs for people with advanced math skills. AI/ML is like cryptography, compilers, or programming languages now: you implement the algorithms a few times for proprietary products or open source libraries, and then they get reused by millions.
The bulk of the work is still connecting the algorithms to real world data sources, creating interfaces to monitor and view their results, and explaining to non-technical people how to use them.
If anything, AI/ML will create more jobs for non-math people working at the glue level, because the better the tech behind the scenes, the more value it creates and the more potential customers there will be.
I wonder if people would be better off learning an appropriate amount of calculus and statistics by simulation, rather than by derivation and proof. You can learn a lot with a random number generator, or by breaking a function into little increments and adding things up. Doing it this way might also reinforce the students' programming skills, or gently expose non-programmers to a little bit of programming.
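For example (both snippets are made up; any function or distribution would do):

```python
import random

# Learn probability with a random number generator: estimate P(X + Y < 1)
# for two independent uniform [0, 1) variables. The exact answer is 1/2.
trials = 100_000
hits = sum(1 for _ in range(trials) if random.random() + random.random() < 1)
print(hits / trials)  # ~0.5

# Learn integration by adding up little increments: integral of x^2 on [0, 1]
# as a Riemann sum of thin rectangles. The exact answer is 1/3.
n = 100_000
dx = 1 / n
print(sum((i * dx) ** 2 * dx for i in range(n)))  # ~0.333
```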
I love derivations and proofs, and majored in math, but they are a deterrent to most people, and unrelated to their jobs like you say. I work for a company whose products are particularly mathematical, yet only a tiny handful of programmers (mostly the ones with science backgrounds) deal with the math related stuff.
Most engineers finish college, start their first jobs, and immediately become so busy with CAD and bureaucracy that they forget all of their math and theory.
I think another problem is: what to do with kids who want to become programmers and have been told that they must get a college degree in something, when programming doesn't really require 4 years of college study? Many of those kids major in Computer Science, which, as many people have pointed out, is not the same as programming. But selective admission into Computer Science programs becomes self-fulfilling in terms of the demand to get into those programs. I took a different route, which was to major in math and physics. And I ended up doing something other than programming for my career, but that's OK too.
Calculus is useless, until it isn't, at which point it turns into an incredible help. You don't need to go super deep, but the basics are occasionally quite useful. More importantly, if you learn only what you strictly need right now, your work possibilities will be very limited.
Also, for some reason, I found abstract algebra related to object-oriented design - a similar kind of thinking.
This is a free course - it won't put you in the unfortunate $3,000 hole you described. (I think calculus is useful, but I don't think it is worth paying that much money.)
People seem to think that financial software, artificial intelligence, map software, the math libraries they use, graphics software, etc., all emerged by magic. They did not, and all of them required calculus.
> Nobody actually needs order of magnitude either, so long as you can tell one thing is bigger than another, which is about one day of learning after you cover polynomials.
If you're writing code that deals with list-shaped user data, you better believe you need this, because how the code scales with n is a constraint on the business.
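A minimal sketch of how that constraint shows up (hypothetical example):

```python
# Deduplicating user-supplied items two ways. Both are 'correct', but the
# first is O(n^2) and the second O(n), and that difference is exactly the
# constraint on how much user data the business can handle.

def dedupe_slow(items):
    seen, out = [], []
    for x in items:
        if x not in seen:  # 'in' on a list scans the whole list: O(n) per check
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    seen, out = set(), []
    for x in items:
        if x not in seen:  # 'in' on a set is O(1) on average
            seen.add(x)
            out.append(x)
    return out

print(dedupe_slow([3, 1, 3, 2, 1]))  # [3, 1, 2]
print(dedupe_fast([3, 1, 3, 2, 1]))  # [3, 1, 2]
```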
Agreed with the other answers here, but it's also useful for some of the 'easier' use cases. To throw an example out: sometimes, when proving an upper bound on a value (runtime, probability of error, expected runtime, etc.), it's helpful to use derivatives to show that a function is decreasing/increasing.
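For instance, with a made-up bound: suppose you've shown the error probability is at most f(n) = n·e^(−n). Since f′(n) = e^(−n)(1 − n) is negative for all n > 1, the bound is strictly decreasing past n = 1, so increasing n only helps.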
Back then, before the Bologna Agreement brought the Yankee idea of modular courses and transcripts to Europe, professors would say that you wouldn't use half of your degree, but that only once you left university would you know which half you didn't use.
Me? I regret that I don't know nearly enough statistics.