Path to a free self-taught education in Computer Science (github.com/ossu)
264 points by lainon on Dec 30, 2017 | 66 comments



Seems like a great list.

One criticism: I believe that the majority of programming jobs are software engineering jobs (even if that's not in the job title). Since many people use CS as preparation for programming careers, I believe this list needs more emphasis on software engineering and on contemporary SE skills and knowledge.

For example, if I remember correctly, testing shows up in the advanced section, and only as one of several options. Doesn't everyone need some testing knowledge/experience?

Another one is requirements analysis and negotiation: figuring out _exactly_ what needs to be done, in a way that makes complete sense all the way through, is feasible and high priority, and isn't nonsense from a UI or database perspective, and then getting other people to go along with it. That has been pretty important in most programming jobs I have had.

Another example: back in the good ole' days of CS, large-scale code re-use may have been mostly a dream for most popular programming language ecosystems (the ones that weren't Perl). But now, even though dependency reduction still seems much cooler, being able to use real package management systems and select the right modules or components is critical for most jobs. I'm not sure that type of thing gets quite enough attention in this list.


Do people get testing experience in a CS degree? For me it was a small part of my software engineering course, but that was it.


In my experience at my school, testing means passing the professor's test cases, which you usually don't get to see in advance.


I'm learning programming on my own. I don't have a formal CS education and cannot afford it. Do I need to pursue this whole curriculum to become good at coding? It seems overwhelming, but I'm not against learning and am not in a rush.

Currently I'm not a professional programmer. I started answering questions on Stack Overflow and Reddit a couple of months ago and am doing really well. I like C and Python, but I regularly write in the latter. My GitHub repo of programming exercises picked up about 300 stars and 50 forks within a month, and I'm releasing more exercises soon.

I had to discontinue my education after 10th grade, but my love for English and math has never faded. I worked in health information for about 15 years, had a passion for it, and excelled in it. My employers never asked about my formal education; they wouldn't believe me if I told them the truth. I quit my previous job to learn programming full time because I like this more than anything now. I want to spend the next 15 years programming and learning math. These two and I are inseparable.

Born in a village in a third-world country, I learned English by listening to the BBC Learning English and VOA Special English radio stations. As a kid, I loved grammar, syntax, punctuation, vocabulary. This is not my best writing :) I have another handle on HN, and it took me about 250 days to reach 14,000 karma here.


No, you do not need to pursue this whole curriculum to become good at coding.


I knew it. Programming is part of computer science. Now I say CS is part of math :)


I've followed this GitHub project for the past year and a half, and although I find the repository full of amazing resources, I still find it unsuitable (for my own needs) since there's limited (to no) feedback from a dedicated resource (i.e., a professor), which is necessary for learning. However, the project maintainers are aware of this and are working on addressing it, along with other issues. It's worth noting that this problem isn't unique to this project; it affects many other (free) online resources.


Join a programming community and ask for feedback on your exercises/projects. There are tons of communities out there. I'm not saying this is a non-issue, but the fix is pretty simple.


Can anyone explain to me the continued prominence of "Object-Oriented Design" as part of a fundamental computer science curriculum?

It seems to me that the basics of OOP, the kind of things that most programmers need to know about, can be covered in a single lecture or two as part of a general programming or software engineering class. Java-oriented design patterns and general API design seem more appropriate as an advanced/optional offering for students who are interested.

The linked curriculum has 6 weeks dedicated to Object-Oriented Design in the intermediate part of the "core programming" section. It seems like that much time could be spent on something more fundamental.


This is explained by the teaching staff having been there for 25 years and there being no effective process in place to force them to change the curriculum.

I’m a CS professor. My colleagues and I have complete control of the program. When you see an outdated program it’s the faculty’s fault in general and the leadership’s fault in particular for not forcing the issue.


Not that my opinion matters; I don't have any advanced degree or experience building curricula. But if I were going to include OO in a curriculum, here's how I'd do it:

1. (Intermediate/Core) I'd cover basic OO in Python. I'd demonstrate basic features like instantiation, introspection, properties and methods before moving on. I would not cover defining new classes before I had covered creating a dispatcher with first-class functions in a hash-map/dictionary (see the sketch after this list), and I would not cover defining new classes unless I could also cover the pros and cons of OO style versus simple functions that accept rich data structures like lists and hash-maps.

2. (Intermediate/Optional) I'd offer Java, C# and/or Smalltalk classes at the intermediate level, where the goal would be competence with the chosen language's implementation, tooling, ecosystem and common programming idioms. These classes would involve object-oriented principles, given how important the paradigm is to these languages, but it wouldn't be the sole focus of the class. Edit: Also C++ if possible.

3. (Intermediate/Optional) A design patterns practicum. (Visitor, observer, factory, etc.). Just a programming-heavy class where you solve problems using as many of the common design patterns as possible.

4. (Advanced/Optional) A bottom-up class where we'd cover implementation details of object systems in a low-level systems language like C. Perhaps a semester-long project might be to build a working object system in C. The pedagogical goal would be to impart an understanding of the kinds of problems that object-oriented programming was invented to solve by relying on first-hand experience rather than dogmatic instruction and hand-waving about the evils of procedural style.

5. (Advanced/Optional) An Object Oriented Design class that would cover issues in designing large-scale systems using object-oriented programming. It would cover issues relating to inheritance, Object-Relational Mapping, and API design. This is where you'd cover UML and that sort of thing.
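
To make item 1 concrete, here's a minimal sketch of the dispatcher idea: first-class functions stored in a plain dictionary, operating on plain data structures, with no classes involved. (The bank-account example and all the names in it are my own invention, purely for illustration.)

    # A dispatch table: first-class functions stored in a dict give you
    # polymorphic behavior without defining a single class.
    def deposit(account, amount):
        account["balance"] += amount

    def withdraw(account, amount):
        account["balance"] -= amount

    HANDLERS = {"deposit": deposit, "withdraw": withdraw}

    def dispatch(action, account, amount):
        # Look the handler up by name and invoke it.
        HANDLERS[action](account, amount)

    acct = {"balance": 100}   # rich data structure, no class needed
    dispatch("deposit", acct, 50)
    dispatch("withdraw", acct, 30)
    print(acct["balance"])    # 120

Once students can read and write this, the pros and cons of moving to classes (encapsulation, but also ceremony) are much easier to discuss.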


I think the main issue with structuring a curriculum like this (assuming the classes are sequential) is that it would take 2.5 years of university to cover it all. That's A LOT of class time to cover only one programming paradigm when the average CS curriculum is already heavy and lacking slots for electives.

Someone would need to start this sophomore year and not miss a single course until graduation in order to complete it. While that's fine for core classes like chemistry, which have multiple time slots offered per semester, having such a brittle pathway for a less-taught topic would cause some grief to plan around.


The basics of chess can be taught in 1 hour but learning to be an expert takes many years. I see OOP in the same way.

Most software today is written in an object-oriented language, and for large software systems the quality of your object-oriented design has a huge impact on how much it costs to maintain and improve over time. I've been writing software for 20 years and I've never had to implement a hash table or a search algorithm, but almost every day I'm thinking about my software design using object-oriented principles. Personally I think a 6-week course is too short.


There's really no amount of classroom learning that will replace 20 years of experience with object-oriented systems. If you're going to put an inexperienced recent graduate in charge of design for your large software system, you deserve what you get. Ideal coupling of data to code is an arbitrarily complex, context-sensitive problem, and each individual company might take a different approach. That is the sort of topic best covered in an advanced class for students specifically interested in the problems associated with a large codebase. It's not a good fit for introductory-to-intermediate "learn how to program" instruction.

We could debate the importance of teaching search algorithms, too. But it's a different issue. Search algorithms are a solved problem[1]. Algorithms like that, in addition to being mental exercises, programming practice, and preparation for a career in academic research, are usually used to demonstrate complexity analysis and space/time tradeoffs. The space/time tradeoff is a problem you will face in nearly every environment no matter what language or technology you use.

[1] As far as we know... there might still be research on searching algorithms, but for the purposes of the vast majority of software engineers, search functions aren't something they'll ever need to implement.
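
To illustrate the space/time tradeoff mentioned above with a minimal Python sketch (my own example, not from the thread or the curriculum): memoization spends memory on a cache to cut an exponential runtime down to linear.

    from functools import lru_cache

    # Naive recursion: O(2^n) time, O(n) space on the call stack.
    def fib_slow(n):
        return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

    # Memoized: O(n) time, at the cost of O(n) extra space for the cache.
    @lru_cache(maxsize=None)
    def fib_fast(n):
        return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

    print(fib_fast(80))  # instant; fib_slow(80) would run for ages

That tradeoff shows up in nearly every environment, no matter the language or technology.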


You are more likely to get a job fixing existing software than coding something new. Plenty of existing software is already written in an object oriented style. The majority of Java jobs out there would require a reasonable understanding of the paradigm.


I don't think this is sufficient reason to make it a substantial focus of the core, required curriculum. Again, I do not have a problem with OO course offerings or teaching Java, just putting it front and center as a core, fundamental computer science skill.

I would argue that a large codebase at any company is going to be impenetrable to most fresh computer science graduates. Production software architecture almost always has idiosyncrasies that they can't teach you in college, no matter what paradigm is used. So you are always going to have to learn on the job in those situations. You'll pick up the necessary object-oriented principles then.

Trying to teach OO programming to newbies in a classroom environment usually leads to a lot of incidental complexity and dogma that feels like tedious make-work to students, because it is. Meanwhile, with appropriate guidance and peer review from a senior programmer or software architect, and an opportunity to observe and participate in a real production build process, OO design is likely to just "click" in a relatively short amount of time.

One of the key points of OO is that it is a discipline. OO involves explicitly coupling data to logic in encapsulated chunks, with a clear separation between the API and implementation details. Trying to teach that particular discipline to students who still can't accomplish anything meaningful with code is, I believe, a waste of their time. OO features like public and private methods seem obvious when you actually have an API to maintain, whereas to a student still learning the fundamental techniques and technologies the distinction seems arbitrary and largely pointless.
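
A small Python sketch of that public/private separation (the class and names are invented for illustration): the underscore-prefixed attribute is an implementation detail, and the public methods are the API you'd maintain.

    class Counter:
        """Public API: increment() and value(). The rest is private."""

        def __init__(self):
            self._count = 0      # implementation detail, private by convention

        def increment(self):     # public API: the contract you maintain
            self._count += 1

        def value(self):         # public API
            return self._count

    c = Counter()
    c.increment()
    print(c.value())  # 1; callers never touch _count directly

Until you have actual callers to protect, that boundary feels like ceremony; once you have an API to maintain, it's the whole point.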

Encouraging and/or enforcing coding discipline is the responsibility of senior members of a given team and is most effective when practiced that way.



A suggestion for the section "Computer Science - Great Readings": how about "Readings in Computer Architecture"? I'm aware it's a tough book to read, and there are already books like "The Elements of Computing Systems" (NAND2TETRIS) and "Computer Organization and Design: The Hardware/Software Interface", which suffice for getting acquainted with hardware architecture, but Readings in Computer Architecture will take the reader down to the transistor level. Any other book suggestions that take the reader to the transistor level would be appreciated.


Computer Organization and Design: The Hardware/Software Interface by Hennessy and Patterson is a great book for any coder who is interested in how CPUs actually tick.

However, I don't think knowing the gory details of transistors is necessary. What they need to be aware of is cache hierarchies and pipelined, superscalar processor architecture.

These sophisticated architectural features can be defeated by code that causes them to behave poorly, such as code that does random accesses in a large buffer, making it unlikely that the desired data is already in cache.
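
A rough Python demonstration of that effect (my own sketch; interpreter overhead blunts the difference compared to a compiled language, but on a large enough array the cache misses still show up):

    import random, time

    N = 2_000_000
    data = list(range(N))
    seq = list(range(N))        # sequential access pattern
    rnd = seq[:]
    random.shuffle(rnd)         # random access pattern over the same buffer

    def scan(indices):
        t0 = time.perf_counter()
        s = 0
        for i in indices:
            s += data[i]        # identical work, different memory order
        return time.perf_counter() - t0

    print("sequential:", scan(seq))
    print("random:    ", scan(rnd))  # noticeably slower on most machines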

The ways in which real computers differ from idealized ones is what CS students need to know.

Also, CS students ought to learn to read others' code.


I'd really like to pursue a PhD in Comp Sci (My BS is in CS). But with a family there's just no way. If I kept my full-time job, I might have the money to do it, but I wouldn't have the time. And vice-versa if I quit my job and became a full-time student.

I did go and get an online Masters of sorts (an MBA from WGU), which was definitely do-able while working full-time (and also useless), but it was nowhere near as rigorous and time-consuming as an engineering graduate degree would be :)


Work for a university as an outside contractor in a research programming position, for example: https://cmu.taleo.net/careersection/2/jobsearch.ftl?lang=en

You work with PhDs, so you can pursue your graduate interests directly with them while being involved in the research, and get paid at the same time, albeit not at industry rates, so prepare for a lower salary. OpenAI hires developers too, and plenty of researchers are there.


It is frustrating that education requires so much wealth. As a young adult I was completely focused on getting a paid position out of my degree and took internships based on pay alone. Despite a professor pushing me to stay on for further education, and despite easily getting a 1st, I couldn't justify the costs of continuing.

Having seen what poverty can do to someone I have always put securing my future financially over my education, in the hope that one day I can return to university and resume.

And I am one of the lucky ones, who could afford an education and had its benefits explained to me from a young age.


A PhD is paid. Universities near me pay around $30,000+/yr.


Even so, I could never get my family to agree to a ~$100k/yr drop in household income :P


Do you feel your master's would have had more value from another institution? Curious because I'm looking to get an MBA (my employer will cover the cost).


Do not get a full-time, on-campus MBA from any programme that is not spoken of as top 10 (there are more than ten of those, for what that's worth). If you are already doing the kind of job that people get an MBA to get, and your company will pay for you to get an executive MBA, that is sensible, if a lot of work.

Most of the value of an MBA is in the network, not the knowledge, so distance-learning and correspondence degrees are worth drastically less. You can get an MBA from the UK's Open University or Queen Mary's, University of London for a little over twenty thousand pounds or thirteen thousand, respectively.


Yes, I would agree with this. Though, since I work in govt contracting, having that MBA can sometimes "check a box" when a position has a hard requirement for a certain degree (or degree level). But, yea, if I had it to do over again, I would pick a different Master's degree :)


Thank you!


I'm going to show this to my co-worker. He was telling me the other day how he felt trapped at his job and wished he could learn computer science, but didn't feel like he'd ever be able to get the money or time to do it.


Tell him to apply to UoP and get an accredited degree tuition-free online: https://www.uopeople.edu/programs/cs/degrees/computer-scienc... It's $4k or so for 4 years of exams ($100 each). The problem with self-learning courses is that most people decide to skip things they believe won't be useful but actually are (which is why they're in the university curriculum), and then later they get lost because they didn't do all the pre-reqs. UoP also has TAs.


FWIW, I thought you were referring to UofPhoenix until I read the URL. I hadn't heard of UoPeople, but thanks for sharing. Do you know anyone who's had success with their programs?


If anything, it's time that's the hardest to commit, in my opinion. But even setting aside an hour or two on a weekend for something you want to do is something. Good luck to your coworker, whatever path he chooses!


If you don't mind me asking, what does your co-worker do for a living?


He's a hand polisher at the granite shop we work at. He's been doing it for 8 years, since he came to Canada from Iran. He needs to get some kind of high school equivalency, but he ended up in an arranged marriage, with children, and had to sponsor his wife and his parents to come over.

It sucks; he's a great guy and one of the hardest-working people I know. He also blames his English. I've heard a lot of people with English as a second language use that as an excuse, and, as with him, it usually comes from people who speak and write English better than a lot of native English speakers I've met.

Since then I've offered to show him some of the machine work I do and have tried to explain more about programming the machines to him. He seems excited to learn so I think he'll enjoy stuff he can learn at his own pace.

The funny thing is I've actually learned a lot about computer science from things written by Iranians or other people for whom English was not their first language. It's a barrier, but not as much of one as I've heard so many people make it out to be, especially if you know enough to read, write and communicate.


That is fascinating. I hope your co-worker finds success in the future. Thanks for sharing.


As a self-taught programmer, I hoped one of these would explain what I need to wrap my head around "frameworks": abstractions that make it possible to build the vast diversity of development tools on top of programming languages, while themselves being written in those languages.

You can learn about idioms such as callbacks and hooks, but really conveying how these are organized into frameworks seems difficult for authors; it's easier to just tell you, "if you want this effect, do that." Or maybe I'm just not getting something about computer science pedagogy; maybe the domain of programming where my interests lie doesn't teach programming the way I think about learning it. I will dig into this to see what's here.


A framework is kind of like a big template for software applications. A framework gives you a skeleton to work with, where the programmer fleshes out the details in the flexible spots. Except that instead of just some boilerplate code, you have a whole set of libraries and applications.

For a trivial analogy, consider this template for a C program:

    #include <stdio.h>
    #include <stdlib.h>
    
    int main(int argc, char *argv[])
    {
      // Your code here.
      return 0;
    }

It is not technically a "framework" because it's just a few lines in one file, but a real framework differs mostly by magnitude (though also by the fact that this template is trivially modifiable). And maybe, if this is an IDE, it puts your cursor right at the line where you're supposed to start putting your own code. Conceptually, what this does is give you a clear place to start developing. It sets up some libraries that you'll probably use, defines a main function that accepts arguments, and gives you a default return code of 0 (which means success, by convention).

Frameworks essentially do the same thing on a larger scale, and usually with multiple discrete components. The key difference between a library and a framework is that a library is something that the developer imports and invokes, whereas the framework establishes a useful context for a specialized type of development.
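
One hedged way to see that library/framework difference in running code (a toy sketch in Python, not any real framework's API): with a library, your code calls the library; with a framework, the framework owns the control flow and calls the code you plug in.

    # Library style: your code is in charge and invokes the library.
    import json
    config = json.loads('{"debug": true}')     # you call, you control flow

    # Framework style (toy): the "framework" owns the main loop and
    # calls the handler you hand it. This is inversion of control.
    def run_app(handler):
        for event in ["start", "click", "quit"]:   # the framework's event loop
            handler(event)

    def my_handler(event):                         # the part you write
        print("handling", event)

    run_app(my_handler)

That "don't call us, we'll call you" structure is what makes a framework feel like a context you develop inside, rather than a toolbox you reach into.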


CS doesn't really cover software design at the level of programming language idioms like callbacks. The closest it might get is lambda calculus and reasoning about code as graph reduction, where code is just a data structure.

Software engineering might cover design patterns, but even then it's a limited set of techniques with broad application - idioms in practice are tuned to their language and execution environment, because clumsy idioms don't propagate.


Figuring out how to put those things together to make a cohesive library or program is the hardest thing I've had to do on my own. I haven't really found anything that teaches that. There's lots of beginner stuff about syntax and programming concepts, and some fairly technical, abstract stuff out there about algorithms and data structures, but not a lot about how to put everything together into a useful program.

The only way I've learned that was to just start making things. Whenever I ran into a problem I'd look up whatever I was having trouble with. I failed a lot. I've had to go back and restart things more than once, or ended up giving up on things that were dead ends. Then after a couple of years of that, I stopped having to quit because I'd completely messed up the idea of what I was doing.

I honestly have no idea if the things I've learned are correct or not, but the programs I write seem to do what they're supposed to now.

I think the reason there's not much about how to put programs together is that it's pretty much up to you. There are a million ways to do one thing, and most programs have a million things that each need doing. You need a good idea of what you would like these things to be. Then you pick one and do it, then another one and another one, making sure the first things you made still work with the new ones. If you don't know how to do one of the things, either look up how other people did it or move on for a bit and come back to it.

A framework, library or program is just input data, either from you or a user, processing of data and output. It's up to you to figure out how to structure these things for the needs of your program.

There is no right way, just whatever way you find that works. You can look at building a program like building a machine. The programming language you learned gives you the nuts, bolts and basic materials; the algorithms, callbacks and hooks are the prefabbed parts; and you're the mechanic that bolts it all together with a wrench and a whole lotta grease.


This is quite awesome. I'm gonna use it!


We really need to quit forcing programmers to take calculus. Even set theory is used by only a minority of us. Nobody actually needs order-of-magnitude analysis either, so long as you can tell one thing is bigger than another, which is about one day of learning after you cover polynomials.

The vast majority of us never use more than basic algebra. Counting, multiplying and dividing money, and estimating effort in story points (and maybe dollars as you get seniority) are the big math items.

I'm horrible at calculus, a fault that cost me around $30,000 in makeup classes and a delayed graduation as I kept losing scholarships.

Yet I have never used, and will never use, calculus in my field. This was told to me by the career coach in my intro class, who said almost none of us would ever use calculus, yet my school still made everyone take up to Calculus 4. Who the hell is finding the area under three-dimensional curves without relying on a library? Madness.


I'm inherently biased because I studied applied mathematics, but if you advocate removing mid-level mathematics (calculus, set theory, much of college algebra) from a computer science degree, what you have left is a few years of courses about 'scripting' and the practical use of particular specialised tools. One cannot aim to become some kind of 'engineer' or technical person without at least a grasp of these mathematical techniques, even if they're only learnt for the sake of understanding and then forever set aside. They enrich your thinking; without enrichment your mind remains... impoverished.

Beyond that, computer science is an offshoot of mathematics (Turing laboured on computability, Boole on the algebra of logic, von Neumann on finite difference methods). To rid computer science of mathematics makes about as much sense as stripping psychology out of counselling.

You have a much too 'practical' view. What you have in mind is not a graduate degree in computer science but some kind of... vocational training for programmers. Which is OK, if that's what you want, and provided you leave the faculty of computer science intact for those who wish to study the abstract field and make progress therein.


I agree. Too many people conflate a computer science degree with the ability to code. The real purpose of a computer science degree is to learn to model, analyze, and solve problems in a computational framework. Mathematics is the language used to do that, so math is a necessary part of such an education.

When it comes to sitting down and writing software, such a mathematical computer science education provides a clear benefit. The more math you know, the more problems you can formulate and solve with a computer. If the job is cranking out web pages, a mathematical background will increase efficiency, because you'll recognize that the problems at hand fit into familiar formalisms with familiar solutions. Moreover, if you run into a hard problem, a mathematical background will provide the necessary tools to work through it in a principled manner.


I don't think you can properly learn probabilities without understanding a degree of calculus. And I don't think you can have a complete scientific education without a measure of probabilities and statistics.

On the other hand, I think you don't need computer science to be a programmer, and a lot of us have suffered and will suffer from the conflation of these two fields. Many "computer science" curricula are actually programming curricula. Many people who want to learn programming enroll in computer science because the difference is not clear at all. This was the case for me, and I was lucky to actually like CS, but it was not what I thought I had signed up for.


> I don't think you can properly learn probabilities without understanding a degree of calculus. And I don't think you can have a complete scientific education without a measure of probabilities and statistics.

I don't think the parent said zero calc or statistics (rereading, the parent did say no calc, which I disagree with). I think the complaint stems from the fact that many CS curricula started in, and often still live in, the math department. Should a CS student take calc classes up to level 4, or stop at level 2 and instead have more advanced courses around the computer part of the science?


Just because you don't understand it doesn't mean it isn't useful. For a trivial example, I often use calculus to measure the rate of change of telemetry from production systems, and that's just as a simple developer measuring the performance of things I'm responsible for.

I assume by "order of magnitude" you're talking about big-O and friends... I use that math all the time when I deal with things at scale. If you're doing anything that's not a CRUD app with a trivial number of users, you quickly run into the math.
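
A concrete (invented) illustration of the kind of math that bites at scale: deduplicating n items with a list scan is O(n^2), while a set makes it O(n), and past a few tens of thousands of items that's the difference between milliseconds and many seconds.

    import time

    items = list(range(10_000)) * 2         # 20k items with duplicates

    def dedup_quadratic(xs):                # O(n^2): "x not in list" is O(n)
        out = []
        for x in xs:
            if x not in out:
                out.append(x)
        return out

    def dedup_linear(xs):                   # O(n): set membership is O(1) on average
        seen, out = set(), []
        for x in xs:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out

    for f in (dedup_quadratic, dedup_linear):
        t0 = time.perf_counter()
        f(items)
        print(f.__name__, time.perf_counter() - t0)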


This is one of the reasons my undergrad degree is in 'Computer Information Systems' instead of 'Computer Science'. The number of CS classes between the two degrees was nearly identical, but the high-end math (anything beyond calc 2, iirc) was replaced with business classes (accounting, economics, finance, etc.) and statistics.

At the time I was in school there was a real worry about outsourcing all programming, so I didn't want to be 'just a programmer.' I hoped that in the worst case, if all the jobs were outsourced, I could be in charge of a remote team. It turns out the outsourcing didn't happen quite to that level (although at one point in the past I was in charge of a team on the other side of the world), but the business courses have been very valuable over my entire career so far.

I'm seeing this type of split more often in CS, and I think it's great. CS + business, CS + art, CS + media, CS + whatever is, I think, the future. Programming alone is great (someone has to write the libraries we all use ;) ), but take a competent programmer, add another skill they are passionate about, and you have a real force.


I did CIS too so I could avoid math but now find myself regretting it. I always enjoyed math but my middle school teachers were more concerned about yearbook planning than teaching and I always suffered from that.

I’ve actually recently started exploring Linear Algebra on my own. So seeing this list is helpful, as I’d like to explore Calculus too.

I agree with your point, though. My courses in Economics and Accounting have proven valuable to my career so far.


The need to take calculus and other math-related classes in an engineering or science field is not about solving integrals or differential equations in your head at a normal 9-to-5 job. It's about something extremely important in computer science and engineering in general: abstraction; the ability, when faced with a [new] problem, to hide the details away for a moment and concentrate on the important parts, recognize patterns, know which tools to use, and so on. Then you can bring the details back when you need them, to form a solution.

You'd certainly go crazy if you started to think about all the little details as soon as you faced a problem, and you'd probably never deliver a solution in the time needed.

It turns out that mathematics is very good at gradually giving you new levels of abstraction to solve new problems, or old problems in new ways. The important thing is not to memorize formulas (that's what books and Wikipedia are for), but to know which one to use, and when.


> We really need to quit forcing programmers to take calculus.

I used Calculus one day to solve a random business problem.

I noticed the pattern was a Taylor Series.

It had been over 10 years since I learned it, but I immediately recognized the pattern.

I solved it via brute force on Excel. It took over 50 pages of Excel calculations to reach the answer. (About 2500 lines of calculations, which is easy to do in Excel.)

And once I noticed the pattern, I was able to turn it into a formula.

And now, I had a one-liner formula, that I was able to use to solve my business problem. Then, I plugged this formula into Excel too, and now I had a little Excel app that could get the answer for me.

Did this make me any money? No. Did this directly make the business any money? No. But it did help the business: they now have a mathematical formula they can use to answer that specific question if another customer asks for it in the future, specifically during the initial sales process.

And all this, because I had been forced to study Calculus in college.
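
The comment doesn't say which series it was, but here's an invented stand-in with the same shape in Python: brute-forcing a Taylor series term by term, spreadsheet-style, next to the closed form it collapses into once you recognize the pattern (e^x in this sketch).

    import math

    x = 1.5

    # Brute force: add up thousands of Taylor-series terms one by one,
    # like rows in a spreadsheet: e^x = sum over k of x^k / k!
    total, term = 0.0, 1.0
    for k in range(2500):
        total += term
        term *= x / (k + 1)     # next term: x^(k+1) / (k+1)!

    # Recognizing the pattern collapses 2500 rows into a one-liner.
    print(total)                # 4.4816890703...
    print(math.exp(x))          # same answer, no spreadsheet required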


Just one time in 10 years?


Funny, because in my field (crypto) I encounter calculus directly fairly regularly -- despite the fact that crypto looks like it is all about discrete math. I also see calculus being used in other fields that I have had to interact with; AI immediately comes to mind (stochastic gradient descent), and various fields related to signal processing (including computer vision).

Really though, the immediate applicability is beside the point. We are not talking about learning to program, we are talking about learning computer science. There is value in having a breadth of understanding in basic math topics (yes, calculus is a basic topic; if you want something advanced you need to look at real analysis).


"Programmers", sure. But for a professional, degreed practitioners let's have some proper education please. What the heck is so hard about a bit of 17th C. mathematics anyway? Or are we talking about more advanced topics like "Analysis" (19th C.)? Either way, I don't want a dumbed down low wage profession. I'm looking for parity with at least doctors, accountants and high school science teachers..

By the way, I am wondering if you had some bad teachers who themselves failed to fully understand calculus. Good teachers make all the difference.


I totally agree. We are slowly diluting our profession by reducing it down to a few online courses. The vast majority of college is not about learning a specific topic. It's about being given a wide range of problems to solve in very different and complex topics that stretch your abilities.

I've been amazed at how many programmers I've interacted with in the past two decades who cannot operate outside the one small subject they currently know. It's one thing to say you can't work on scientific modeling because you couldn't do calc, but when I see front-end developers who say "nope, I can't do back end, ever," I realize that they were never challenged, and they will be the last person I want on my team. I want an employee who says "I don't know that topic, but I'm going to learn it and solve this problem!" Dumbing down our profession doesn't help us.


> We really need to quit forcing programmers to take calculus. ...

This is a resource for learning Computer Science, not computer programming. Calculus and many other advanced mathematics topics are very much needed if one wishes to learn Computer Science.


I think programming will move away from the standard business stuff as things become automated.

AI, machine learning and distributed systems will become more in demand to replace it.


To the extent that this is true I doubt that it will create many additional programming jobs for people with advanced math skills. AI/ML is like cryptography, compilers, or programming languages now. You implement the algorithms a few times for proprietary products or open source libraries and then they get reused by millions.

The bulk of the work is still connecting the algorithms to real world data sources, creating interfaces to monitor and view their results, and explaining to non-technical people how to use them.

If anything AI/ML will create more jobs for non-math people who are working on the glue level because the better the tech is behind the scenes the more value it creates and the more potential customers there will be.


I agree. I'm really glad I had to take Calc 1 since it taught me about a whole arsenal of problem solving techniques that I didn't know existed.

Beyond that, I wish that instead of Calc 2 and 3 they would have had more heavy programming courses.


Like what? As a person who never took any Calculus, what did I miss out on with Calc 1?


I wonder if people would be better off learning an appropriate amount of calculus and statistics by simulation, rather than by derivation and proof. You can learn a lot with a random number generator, or by breaking a function into little increments and adding things up. Doing it this way might also reinforce the students' programming skills, or gently expose non-programmers to a little bit of programming.
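
For instance (my own sketch, standard library only): calculus by adding up little increments, and probability by simulation with a random number generator.

    import random

    # Calculus by increments: integrate x^2 from 0 to 1 with a Riemann sum.
    n = 100_000
    dx = 1.0 / n
    area = sum((i * dx) ** 2 * dx for i in range(n))
    print(area)           # ~0.3333, converging on the exact answer 1/3

    # Statistics by simulation: P(two dice sum to 7) by Monte Carlo.
    trials = 100_000
    hits = sum(random.randint(1, 6) + random.randint(1, 6) == 7
               for _ in range(trials))
    print(hits / trials)  # ~0.1667, converging on the exact answer 1/6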

I love derivations and proofs, and majored in math, but they are a deterrent to most people, and unrelated to their jobs like you say. I work for a company whose products are particularly mathematical, yet only a tiny handful of programmers (mostly the ones with science backgrounds) deal with the math related stuff.

Most engineers finish college, start their first jobs, and immediately become so busy with CAD and bureaucracy, that they forget all of their math and theory.

I think another problem is: what to do with kids who want to become programmers and have been told that they must get a college degree in something, when programming doesn't really require 4 years of college study. Many of those kids major in Computer Science, which, as many people have pointed out, is not the same as programming. But selective admission into Computer Science programs becomes self-fulfilling in terms of the demand to get into those programs. I took a different route, which was to major in math and physics. And I ended up doing something other than programming for my career, but that's OK too.


Calculus is useless until it is not, at which moment it turns into an incredible help. You don't need to go super deep, but the basics are occasionally quite useful. More importantly, if you learn only what you strictly need right now, your work possibilities will be very limited.

Also, for some reason, I found abstract algebra related to object-oriented design; it involves a similar kind of thinking.

These are free courses, so they won't involve the unfortunate $3,000 lock-in you were in. (I think calculus is useful, but I don't think it is worth paying that much money.)


> Calculus is useless until it is not, at which moment it turns into an incredible help.

Yes! And that's why you need to learn to be fluent in: Spanish, Chinese, Russian, French, and some other languages. You never know when you need them!


People seem to think that financial software, artificial intelligence, map software, the math libraries they use, graphics software, etc. all emerged by magic. They did not, and all of them required calculus.


> Nobody actually needs order-of-magnitude analysis either, so long as you can tell one thing is bigger than another, which is about one day of learning after you cover polynomials.

If you're writing code that deals with list-shaped user data, you better believe you need this, because how the code scales with n is a constraint on the business.


I agree with the other answers here, but it's also useful for some of the "easier" use cases. To throw an example out: sometimes when proving an upper bound on a value (runtime, probability of error, expected runtime, etc.) it's helpful to use derivatives to show that a function is decreasing or increasing.
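
A toy Python version of that move (my own invention, using a numerical finite-difference check in place of a symbolic proof): showing that a bound like f(x) = x / 2^x is decreasing for x >= 2, so its value at 2 bounds everything after.

    def f(x):
        return x / 2 ** x       # a quantity you might want to bound

    def deriv(g, x, h=1e-6):    # numerical derivative by central difference
        return (g(x + h) - g(x - h)) / (2 * h)

    # f'(x) < 0 across [2, 50], so f is decreasing there and
    # f(2) = 0.5 is an upper bound for every f(n) with n >= 2.
    assert all(deriv(f, x) < 0 for x in range(2, 50))
    print(max(f(n) for n in range(2, 50)), "<=", f(2))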


Back in the day, before the Bologna Agreement brought the Yankee idea of modular courses and transcripts to Europe, professors would say that you wouldn't use half of your degree, but that only when you left university would you know which half.

Me? I regret that I don't know nearly enough statistics.





