This topic comes up every so often. I think my previous comment applies here [1]:
I started a math degree after 16 years of programming without any math beyond high school (the highest being high school calculus). Most of my work as a software developer didn't require any "higher" math.
Once I began studying math, including Modern Algebra, Analysis, Graph Theory, Category Theory, etc., I realized I understood many topics on an informal level, in a non-rigorous sort of way, through programming. I had a good sense of major algorithms and data structures as well as their running times. Once I did have more math under my belt, things did become easier, and I started to see connections and commonality between problems across different domains, i.e. more than one way to skin a cat.
Part of the reason I began studying math is that I felt it was my limiting factor. The range of problems I could tackle as a programmer was limited by math. It turns out this was partly true.
The biggest misconception is that in Math there is one "correct" answer. This is almost never the case. Some of the most interesting solutions in Computer Science come directly from Math topics that were once considered "abstract". Likewise, some of the most interesting problems are solved through approximation algorithms for seemingly intractable problems, often requiring a bit of "hacking" and real-world experience beyond what you'd get from a formal education in Math or Computer Science.
I'm in a similar boat as you. I went up through Calculus in school and hated it. Around 5 years ago I developed a strong interest in learning relevant applied math and have been enjoying it since.
Some things I'd add:
1) Math is fun! If you have the aptitude and disposition to enjoy writing software, you'll love working out math problems. They're little nuggets of mental stimulation that you can work on with just some paper, a pencil, and maybe a pocket calculator.
2) You're spot on about an experienced programmer already having an intuitive but non-rigorous understanding of many concepts. It's mostly a matter of learning to read and write comfortably using the notation, which is really similar to learning the syntax and semantics of a big computer language with poor reference material.
3) You really have to have basic math down. This means going back and re-learning stuff like multiplying two binomials with FOIL or dividing by a fraction via its reciprocal (worked examples after this list).
4) Calculus and Linear Algebra are the father and mother of applied math. You'll save yourself a ton of grief if you learn them first (and I mean really learn them; maybe you took a calculus class in college, but can you apply the chain rule right now?). I'm learning Linear Algebra currently, which is something I should have done years ago. Part of the problem with self-teaching is getting things out of order.
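To make 3) and 4) concrete, here are the kinds of mechanics I mean, written out (my own quick examples, not from any particular book):

    FOIL, i.e. the product of two binomials:
      (a + b)(c + d) = ac + ad + bc + bd

    Dividing by a fraction is multiplying by its reciprocal:
      (a/b) / (c/d) = (a/b) * (d/c) = ad/bc

    Chain rule:
      d/dx f(g(x)) = f'(g(x)) * g'(x)
      e.g. d/dx sin(x^2) = cos(x^2) * 2x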
I agree that getting things in the right order is important, but would argue that the order in which math is usually taken in the US is not the optimal one!
I recently took calc 1, 2, 3 and linear algebra through my local community college, and then started working my way through a wonderful book on mathematical proofs (http://www.amazon.com/Mathematical-Proofs-Transition-Advance...) as preparation for working on higher-level math. I would now argue that being able to understand and write proofs is a (the?) key mathematical skill for understanding what I would call 'real' (higher) math, and that it could be learnt by most students following high school algebra. My impression of the calculus series and the linear algebra course was an excessive focus on calculation; the math proof book was way more fun, which is surprising for a subject often thought too difficult for first-year college students.
For those who are intimidated by the idea of a book on proofs (like I used to be), an example from the third chapter:
Theorem: Let x be an integer. Then x^2 is even if and only if x is even.
It seems so simple, and I think it would be accessible to anyone who has completed high school algebra, but I found that even having done those calculus and linear algebra courses, I had no idea how to go about actually PROVING this! The book, however, goes through the thought process step by step, teaching the skills needed to understand real math books like Rudin.
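For the curious, here's the flavor of the argument (my paraphrase, not the book's exact proof):

    If x is even, write x = 2k. Then x^2 = 4k^2 = 2(2k^2), which is even.
    For the converse, take the contrapositive: if x is odd, write x = 2k + 1.
    Then x^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1, which is odd.
    So x^2 even forces x even, and both directions are done.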
I wish there were some sort of open-source prerequisite chain showing which order to learn subjects in. That's the hardest part of self-learning. Elon Musk had a Reddit AMA recently where he was asked how he knows so much. He responded that he thought everyone had the capability of learning more than they thought, but that the key was to look at knowledge as a semantic tree: if you learn things in the wrong order, they won't have anything to hang off of.
Number theory. Set theory. Category theory. Combinatorics. Graph theory.
Linear algebra certainly has applications in some of the above. But I don't think that calculus & linear algebra can be fairly described as "father & mother" to these areas. (Am I wrong? I could be missing some connections; I'm not a mathematician.)
Not wrong - the poster above you probably means pure math in the analysis sense - real analysis, topology, functional analysis, algebraic topology. All of which are abstractions/generalizations (zoom out, if you will) from the real life world of 2 and 3 dimensional calculus/LA to N or infinitely many dimensions.
I probably should've thrown in combinatorics, which certainly existed before calculus or linear algebra, and certainly plays a role in applied math.
I would say graph theory is part of combinatorics, and set theory is part of logic.
Category theory was born out of trying to abstract the relationships between different objects in abstract algebra, so is kind of the child of abstract algebra and logic. I think it's fair to say the parents of abstract algebra are combinatorics and linear algebra.
Number theory at an elementary level is combinatorics, but at higher levels branches into analytic number theory (Calculus) and algebraic number theory ((Linear) Algebra).
> 4) Calculus and Linear Algebra are the father and mother of applied math.
> You'll save yourself a ton of grief if you learn them first
Baby Rudin and Axler are used currently by Harvard Math 55 to teach those subjects. Rudin might not be very didactic (I would be happy to hear about alternatives), but Axler is a fantastic choice.
If you liked Axler, you might check out Abbott's Understanding Analysis, also in the Springer UTM series. I think it covers somewhat less than Rudin (e.g. looking at baby Rudin's contents, I'm pretty sure Abbott doesn't touch Lebesgue integration) but it's a pretty great introductory analysis book IMO.
As a math graduate student, I second the choice of Abbott's "Understanding Analysis". It's a wonderful beginning book for analysis. Walter Rudin's "Principles of Mathematical Analysis" is an amazing book, but it's difficult to start with.
For a quick intro to Lebesgue integration you can read the beginning of Rudin's "Real and Complex Analysis" or Halsey Royden's "Real Analysis".
I haven't read Axler's book. I liked Hoffman and Kunze's "Linear Algebra".
It's fun because it's incredibly rewarding! The elation of the "a-ha!" moment in math is second to none.
> 4) Calculus and Linear Algebra ...
Though I wouldn't get too caught up in the rigor of analysis or vector spaces right away. If you are self-studying, just spend enough time to feel confident computing and manipulating integrals, derivatives, and matrices.
Then find a good intro discrete math textbook covering a wide range of topics (number theory, graph theory, logic, set theory, etc.) and learn how to write a "good" proof. This will open up a number of mathematical doors.
This has been my biggest realization as I started learning more math. What before seemed very arbitrary and unrelated becomes much more interesting and exciting once you have a bit of background. Unfortunately, I don't know of any way to get people to see the fun in math until they already know quite a lot of it... this was my experience at least, and seems to be pretty common among people who didn't gravitate towards math immediately.
Thanks for writing this. Over the last 4 years I switched from studying computer science with applications in mathematics, to studying math and symbolic logic with applications in computation. I did this alone, without interacting with anyone in the field. I thought I was going insane because of how many direct 'abstract' connections there are from computer science to mathematics and back again. I know these abstract connections exist as words in the world, but many times it feels like I have to go hunt for the word when I already have the idea.
I haven't really found any real-world applications of the concepts I've learned, aside from having to hold a meticulously constructed symbolic reasoning world inside my head for a really long time without observational reality confirming its correctness as a model to describe all things. This makes me pretty good at programming things that are incompletely described, I think, but also explains why Tarski said he was the only sane logician.
I never really hear about autodidacts talking about their experience. It can be really rough most of the time. I literally think it's just luck that I stumble across the right words. I also think it's luck when I manage to understand things and make a connection between them. I have managed to connect such disparate symbols together and maintain that connection strongly for long periods of time (with absolute conviction), that it all really seems like magic when it does work. But, giants, shoulders, yada yada.
> meticulously constructed symbolic reasoning world (...) connect such disparate symbols together and maintain that connection strongly for long periods of time
The construction and maintenance of my psychology using mathematics and symbolic logic to model, explain, extrapolate, analyze, and manipulate it.
I use computer science to explain psychology, in a way that makes the person being judged correct, instead of requiring their behavior to be altered based on personal opinion.
Imagine you have two conflicting sets of data from observation in your mind, and you have to process this data quickly. Taking an arbitrary and insufficient amount of data is selective and results in bias. Over time this results in contradiction, even though both instances of inference are correct with regards to the logical model they rely on and the data fed to the model. Now imagine that you received this data because, over a short period of time, you have experienced such a wide range of life experiences that your observations allow you to collect both sets of data simultaneously and with correctness. Both data models model the world correctly, but when separated into distinct models of 'knowing things' instead of 'one confusing mass of data', you get contradiction.
So imagine someone endures trauma in their life and has their mind molded in a specific way based on the current state of psychology, because over time the thoughts in the patient's mind are transformed into the thoughts in the therapist's mind. Psychology did not experience the trauma, so how can psychology have an opinion on the consequences of bad things happening?
Making inferences adds to data and alters future data models and inferences. How people are judged while they are being 'helped' affects whether that help harms or helps them. I was in group therapy for victims of domestic abuse and my "counselor" told me that she hated people like me.
There's no dependency between learning functional programming, including Haskell, and category theory. I say this as someone who is reading a CT book (Conceptual Mathematics) after being first exposed to the topic by learning Haskell.
> The biggest misconception is that in Math there is one "correct" answer.
Well written... and I'd argue the same is true for both Computer Science and software engineering in general. When teaching beginners, it still astonishes/annoys me how many students tell me, "My program didn't work"... as if there was just one reason why it didn't work, as opposed to hundreds of possible reasons.
I wish more people would scream this from the rooftops. There's that whole movement trying to get more people into programming, like code.org, and they all say "If you're good at math you'll love this". I feel like that's scaring away lots of people who would otherwise do just fine.
Examples: there's almost no math running Hacker News. There's no math in programming most blogs. There's no math in most apps. There's no math in most text editors. Etc, etc, etc. Most programs don't need anything more than arithmetic.
I'm not saying math won't help with lots of problems. Like you said, you found it limiting at some point. But you managed 16 years as a programmer without much math. I'm in a similar boat. I've shipped 17 commercial games, written 6 game engines, and worked on Chrome for 5 years. My math sucks. Would I be better if my math was better? Of course! But the fact that I've been productive without much math knowledge shows, as at least one data point, that you don't have to be good at math to program.
> I feel like that's scaring away lots of people who would otherwise do just fine.
I am much in agreement with this. I know that the guidance counselor on staff when I was in high school would heavily steer people away from going into computer science/programming if they hadn't completed the entire catalog of math classes available at our school. Her assumption was that you needed to be some sort of math wizard in order to be successful in a CS or programming degree.
I would have thought game engines would require a strong grounding in maths across a lot of their "moving parts"? Not being dismissive, just curious how you found working in that environment without strong maths skills.
Modern 3D game engines need math in the renderer, the physics, and maybe the AI, but the majority of a game engine doesn't: loading files, reading inputs, displaying UIs, the object system, scripting languages, most tools, font rendering, localization systems, the game save system, the networking system. Nowadays teams either buy middleware for the parts that need math or they have specialists on the team for those things.
On top of that, lots of game engines aren't 3D. 2D Mario? No serious math in there, at least not in the SNES/NES ones. The 2D Zeldas? Even less. 2D Metroid? Probably less than Mario. Those games didn't use real math for physics, which is about the only place they could possibly have used anything more than basic arithmetic.
> Part of the reason I began studying math is that I felt it was my limiting factor. The range of problems I could tackle as a programmer was limited by math. It turns out this was partly true.
I started studying math intensely (doing every exercise in books, etc) when I realized the same thing: math was a limiting factor for my programming ability. Michael Abrash hints about this in some article, and I sneered at it until I realized it was true.
I considered going back to do a math degree but the amount of hassle involved, as well as other life changes required, made that impossible.
I would like to know how it worked out for you. Are you glad you did a degree program? Do you feel you met people and made connections that were valuable, that couldn't be made by an autodidact?
It was a big hassle and a big life change. But I've always had in the back of my mind that I wanted to do it, and I'm glad I did. I put it off because the money I was making, and the places I was traveling, were too good to pass up at the time. So I saved money with the idea that I would get the chance to go to school.
I did try to learn math as an autodidact before I began the degree. The more abstract the math, the harder it was for me to self-study. It was inefficient at best. At worst, I'd hit a wall and not have anyone to reach out to.
I am guessing you went for a graduate degree in math. How hard was it to get admission? Would you recommend some good schools for doing something like you did? I am just a little younger but hungry for math knowledge.
Thanks for the reply. My solutions to the problem of not having anyone to reach out to have been to make friends with mathematicians, and to hire grad students as tutors.
I had a couple of things in mind. One was means: that you can solve a problem by restating it another way, such that the restatement is almost indistinguishable from the original. E.g. solving a problem in linear algebra that also solves a problem in graph theory, solving a group-theoretic problem that gives you an answer in topology, solving a problem using category theory that gives you a combinatorial answer, using quaternion algebra to compute a rotation, etc.
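A rough sketch of that last one in Python, hand-rolled for illustration rather than pulled from a library:

    import math

    def qmul(a, b):
        # Hamilton product of quaternions represented as (w, x, y, z)
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2)

    def rotate(v, axis, angle):
        # Rotate vector v about a unit axis by angle (radians) via q v q*
        s = math.sin(angle / 2)
        q = (math.cos(angle / 2), axis[0]*s, axis[1]*s, axis[2]*s)
        qc = (q[0], -q[1], -q[2], -q[3])  # conjugate = inverse for a unit q
        return qmul(qmul(q, (0.0,) + tuple(v)), qc)[1:]

    # Rotating (1, 0, 0) by 90 degrees about the z-axis gives (0, 1, 0):
    print(rotate((1, 0, 0), (0, 0, 1), math.pi / 2))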
The other thing is that in Math you are often dealing with the same question but with very different objects or variable types. The same question, where your numbers could be real or complex, integers or finite fields, vector spaces or topological spaces, etc., changes what the "correct" answer might be.
While this is true, I've always found that applying mathematical analyses, such as algebraic reasoning, to computation admits a single, minimal, canonical solution in the end.
Mary L Boas "Mathematical Methods in the Physical Sciences" [1] has absolutely been my favorite and most-used maths text in the 9 years since graduating uni. It's like a reference manual of just about all the non-CS (i.e. continuous/non-discrete) mathematical techniques required in my career. Highly accessible. It's a little too terse in places but I prefer this style of presentation over the insane long-form verbiage in other books I've since discarded which can make even simple topics seem overwhelming: the "Boas" book gets right to the point.
Edit: Calling it a mini-TAOCP of most of the maths needed for physics/EE work might be a bit of a stretch, but I've yet to see another maths text that does better as a highly readable, self-contained and compact reference.
Edit2: I moved house once and thought I'd lost my copy from university. I eventually found it, and yes, I have two copies... It's that important to me for brushing off the things I've forgotten :)
It is a fantastic book. It isn't (only) a reference book though. For me it was the best way of learning the maths used in my physics degree.
The book has many worked examples, and the extensive end-of-section questions have the answers in the back of the book (for every 2nd question). This means you can learn by "reading then doing", and see if you have got the answers right - something many textbooks lack.
When I try to learn from other technical books, I often find myself thinking "I wish they'd written this in the same style as Boas".
I see a lot of complaints in amazon reviews about the lack of answers making self-study difficult.
It got me wondering...suppose there were a website for autodidacts in math and similar topics? Something where people could post and discuss their answers to exercises. It'd solve the whole problem.
You can use math.stackexchange.com for this today. It's frowned upon to just ask an exercise from a book w/o even trying to solve it, but if you show that you made an effort but got stumped, or if you show your solution and ask if it's correct, people will gladly help you out.
I think academics would get pretty annoyed :) Most of my professors just use the exercises from the textbook for homework, usually on the assumption that you can't find the answers online.
I'm sure they would, but I'm more concerned about people in my shoes. University tuition has gotten so expensive these days that I think we need solid alternatives...and that they can use some of that fancy tuition money to write their own exercises, if they don't trust students to do their own work.
Or they could just trust the students. At my university the honor code is such a big deal that they let students take closed-book tests at home.
Not a problem - Boas has the answers for every 2nd question - for tutorials, we would be asked to do the ones without answers.
Maybe DennisP's idea could do the same thing - only post answers to the odd-numbered questions. Of course, DennisP's scheme would only work for books that actually have decent end-of-section questions, unless people made up extra questions as well ...
Concrete Mathematics by Graham, Knuth, and Patashnik is (explicitly, even) a mini-TAOCP for much* of the mathematical underpinnings of computer science.
* I say much rather than most or all since it's focused on asymptotics, recurrences, number theory. Modern theoretical Computer Science draws on a much wider variety of mathematical methods.
Concrete Mathematics is outstanding, and I'm happy to think I'm getting an overview of TAOCP by (very slowly) working my way through Concrete Mathematics.
What I understood after studying CS for over 10 years at a few universities regularly ranked highly at the ACM ICPC is that math is unnecessarily obfuscated for most people. There is even an excellent book, "Concrete Mathematics", out of Stanford that tries to bring fun back to math instead of drying people out with formal stuff that never explains how people over the centuries arrived at that structure.
I honestly believe math language is seriously outdated. It's like using COBOL to express everything. Yes, you can do that, but would you really want to, given a choice? The most trivial things are so insanely complicated to express in the current formal language of math it's unbelievable (try describing geometrical objects with it if you do computer vision), yet there is very little work on developing a better formal language for math.
It's like with Turing machines and complexity theory: who is going to move around a tape in the real world besides some specialized biological systems, not to mention magical 'oracles'? Those abstractions were useful in their day and bore fruit, but why do we still stick to them and just widen the gulf between a more and more closed-unto-itself theory and reality? Yes, it's great that some theory is super cool, but what do we do when we find in 20-30 years that the set of objects satisfying this omnipotent theory is empty?
And when somebody like Mochizuki invents his own formal language to solve a cool problem like the ABC conjecture, we all hate him, refusing to read the proof because it doesn't follow our outdated formal ways...
I agree strongly that math is poorly taught most places and that Concrete Mathematics does an admirable job of teaching math in a way that gets to the beauty of it. Oddly enough, it wasn't until I threw myself into more advanced mathematics that I started to see the fun in it. I absolutely hated learning calculus and linear algebra even though I could do it well. It just seemed so boring. I don't have a good solution to how to make things better, however, the mathematician Paul Lockhart has given this a lot of thought.
All that said, I think there are a number of misconceptions here.
First, mathematicians invent new languages all the time. That's the point of definitions, otherwise we'd be using sets to describe everything. The problem is, you first have to understand the concept well in order to apply suitable definitions. Think probability before Kolmogorov.
Second, Turing machines are a formalism to introduce you to the theory of computation because they are the simplest (or close to it) thing that can compute in the current sense of the word. Once you learn how TMs work, pretty much everyone just accepts them as a given and deals at a higher level.
Third, people are trying to read Mochizuki's proof, but it is very hard. He basically invented his own way of doing things, and so to understand his proof you first need to understand his methods. It's understandable that professional mathematicians with their own careers and areas of research find it hard to read the ~1000 pages of dense mathematics (proof plus prior papers) to understand what is going on. Most people probably haven't read 1000 pages of math in their life, and it takes a while to come to terms with it no matter how smart you are.
This is what drew me to CS over math. I enjoy describing things with code or pseudocode, not difficult-to-understand equations and math symbols. Also, it is much easier to see the practicality of an equation when it's presented in code form.
"Computer science" can mean many different things, but theoretical CS is largely considered a branch of mathematics. While there is some 'code', it has all of the same formalisms, equations and proofs. CS papers can be just as inaccessible as other types of math, if not more so.
After I read SICP and HtDP, I started re-learning math I had forgotten, writing out the equations in Scheme to further grasp the language. At first, doing Spivak's Calculus this way took a long time, but now I can write formulas and basic proofs just as fast in a programming language as I can with a pencil.
I skimmed SICM (Structure and Interpretation of Classical Mechanics) just to get an idea of how they represented the Lagrange equations in Scheme and went from there.
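The same trick works in any language with first-class functions. A tiny sketch in Python for those who don't read Scheme (the numeric dx is my own choice, not anything from SICP):

    def deriv(f, dx=1e-6):
        # Return a new function approximating f's derivative,
        # in the style of SICP's higher-order procedures.
        return lambda x: (f(x + dx) - f(x)) / dx

    cube = lambda x: x ** 3
    print(deriv(cube)(5))  # ~75, since d/dx x^3 = 3x^2 and 3 * 5^2 = 75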
This is something I like doing. Something I keep meaning to do is take it further and write my proofs in a proof checking language, or at least write "unit tests" for my proofs with one (interleaved with the natural language proof with org-mode!), but they all seemed pretty unwieldy. Has anyone tried this with any success?
For your analogy with Turing machines, do you have a better mathematical model of a computer? Nobody works directly with the TM model when they're doing theoretical CS anyway, they just describe the algorithm and everyone understands that if you really really wanted to you could work it down to a TM.
The same is true of mathematics. It might be hard to get everything precise, but that's not the point of mathematics. Nor is the point to be close to reality.
> All you have to do is demonstrate that TMs can be simulated in JavaScript, and that's fairly easy to do since TMs are so bare-bones.
Actually I think this is fairly non-trivial. Sure you could make a "compiler" that compiles a JavaScript interpreter down to a Turing machine, but you would almost certainly not understand the generated states and transitions.
A more elegant model of computation is the lambda calculus. It's also very bare-bones, but it's easy to imagine writing real programs in it. Functional programming languages, at their core, are just lambda calculi with some syntactic sugar.
It's practical, but it's also a good foundational model. It's easy to reason about, due in part to the fact that it models familiar math (partial functions).
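To see how bare-bones it is, here are Church numerals sketched in Python's lambda syntax (purely illustrative; Python just happens to make the encoding easy to run):

    # A Church numeral n is "apply f n times".
    ZERO = lambda f: lambda x: x
    SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
    ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def to_int(n):
        # Decode by counting applications of an increment function.
        return n(lambda k: k + 1)(0)

    ONE = SUCC(ZERO)
    TWO = SUCC(ONE)
    print(to_int(ADD(ONE)(TWO)))  # 3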
There are far simpler models for formal verification than Turing machines, e.g. Smullyan's top-down tableaux method: you write a simple functional snippet and immediately verify it using mostly general induction and easy-to-understand verification steps that can be almost automated. Going all the way down to the Turing level would kill you time-wise before you got to anything useful (even preparing a description of your JavaScript machine in Turing terms); a Turing machine has infinite time available, you don't.
Not to mention there are some issues with formal logic that might cause you problems (hint: why do medical doctors use counter-factuals and not mathematical logic?)
"Beware of bugs in the above code; I have only proved it correct, not tried it" -- Donald E. Knuth
I tried going through "Concrete Mathematics," but I could only do a handful of the problems in the first chapter after trying for a couple weeks. I'm not sure why it seems so hard to me -- I've taken a couple "higher math" courses and am generally considered "pretty good" at math. Perhaps I'm better at bullshit than math...
I always thought that being an Engineer assumes you have knowledge of mathematics. The way this word is being used in software is kind of strange. "I built a blog in PHP, I'm a Software Engineer" feels kind of awkward to me.
Most of us (engineers of any kind) learn quite a lot of maths during uni, and most of us forget 90% of it as it's not needed in most jobs, which is a shame.
The thing I have needed all the time is statistics and probability theory, which keep coming up literally everywhere. Calculus - not so much. If you need something, you can always re-learn it quickly. (For example, I learned quite a lot of linear algebra, but hadn't used it for ~10 years, so when I had to write some 3D gfx/shader code, I had to spend like two days on quaternions, etc.)
I find myself surprised just how often linear algebra pops up. A few years ago I was doing some work with splines, trying to smooth some noisy plant data, and I had to relearn matrix decomposition (LU factorisation). Every year or so since, I've been ambushed by some other "stealth" application of linear algebra in something you'd otherwise think would not be applicable. Nowadays my old beat-up linear algebra textbook from uni sits right beside "C in a Nutshell" and the printed GDB manual on my shelf at work.
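For anyone curious what that looks like in practice, a minimal sketch with NumPy/SciPy (the actual spline-smoothing system was of course bigger than this toy one):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    # Solve A x = b via LU factorization; the factorization can be
    # reused for many right-hand sides, the usual win over a plain solve.
    A = np.array([[4.0, 3.0],
                  [6.0, 3.0]])
    b = np.array([10.0, 12.0])

    lu, piv = lu_factor(A)      # A = P L U, stored compactly
    x = lu_solve((lu, piv), b)
    print(x)                    # [1. 2.]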
And that assumption is wrong. You don't need any knowledge in advanced mathematics to engineer many types of software, that doesn't make the creator less of a Software Engineer.
Being an engineer means using technology, science, and mathematics to solve problems. Well, in many cases you don't need math to solve these problems. The word itself has no root in math either; it's based on the Latin for devise/contrive. Sure, it was used for builders at first, where math was important, but context changes.
Also, Software Engineer is wide and reaching: a 3D GFX Software Engineer will need some heavy mathematics to do his job and do it well; a Web Software Engineer, not so much, but he'll need a wide range of other knowledge the GFX guy doesn't have (HTTP, network protocols, various languages, server technology, database technology, browser knowledge, ...).
It's not about being good at mathematics, it's about knowing as much as possible about the domain of knowledge your role entails, and surprise, not every one of these requires deep math knowledge and understanding.
>Being an engineer means using technology, science, mathematics to solve problems.
That's called practicing engineering. An engineer is a professional practitioner of engineering. You can practice engineering all you want, but if you're not a professional (having received an engineering degree from a certified university), it's dishonest to call yourself one. Honestly, it is elitist. But those of us who obtained our degrees worked our asses off. And I personally hate when people abuse the term to mean anything that took skill: "candy cane engineer", "beats engineer", "drink mixing engineer". It's linguistic inflation to make yourself sound more important/skilled.
Plus, do you really want to drive across a bridge every day that was designed by someone you don't know, someone who was self-taught in the ways of technology, science, and mathematics?
I love this answer. I find technologists can often be complete snobs about their particular niche of expertise. Math, computer science, computer engineering and software engineering are each way too big for anyone to be a true expert in any of them. I find once you've spent some time learning multiple disciplines you tend to realize how little you really know and become a bit humble about your own abilities.
I'm employed as a Software Engineer now but my degree is in ECE. The engineering curriculum at my university (Rutgers) was extremely math intensive. I didn't take a whole lot of software classes but I'm glad I did take all that math. I think it's dishonest to call yourself a Software Engineer and not have an engineering degree. Semantics, but I earned the title.
I understand where this feeling comes from. You worked hard to earn what is (or what should be) a title with prestige: engineer. I too worked hard to earn degrees in engineering. I try to live up to that title every day, and I'm sure you do, too.
But my years of engineering experience have shown me that the title "Engineer" is really more about how you approach problems and what you do to solve them and less about degree credentials. Now, the following example is in the context of electrical engineering on airplanes, not computer programming, but I think it holds true.
One of the best people I work with does not have a college degree. But over years of self-study and real world experience, he has taught himself electronics, some computer programming, and enough mathematics to get by. And when there is a technical problem to be solved on one of our airplanes, he will chase after it relentlessly, and smartly, until it is solved. His system designs are clean and well thought through. He has taught me much about designing for real world implementation. Is he not an engineer? He does more than many of my coworkers who are degree holding EEs. I am not afraid to call him an engineer, because he has earned the title in a different way.
Right, I understand your point. I said in an earlier post that an engineer is one who obtains an engineering degree from a certified university. I realize that it's pedantic and elitist, and there's definitely a part of me who thinks it's idiotic. Because it is. It's a title, who cares right? It doesn't lessen the worth of my degree. I've been trying to come up with valid reasons for why it's important for only people who fit my definition to be called an engineer but I can't think of any. So you're right.
However, I still think it's wrong for people who don't exhibit these qualities to call themselves an engineer. If you make sick beats on your macbook, that's great. But don't call yourself a beat mix engineer.
Edit: Thought of some reasons
It's similar to the "doctor" title. You can be the world's greatest at medicine: self-taught, you can do everything from intubation to surgery. However, you're still not a doctor. You practice medicine. Why? There are things that you can only learn from someone who is more skilled than you, and who is skilled at teaching. That's what a professor is (simple definition). They are an authority on their topic and are the best place to learn from. They teach things that books don't cover. They have experience. They can tell you when you're wrong, and unlike a book can teach you the most current standards and techniques.
Another point is the completeness of education. Your coworker, does he know vector calculus? Linear algebra? The forward-active voltage for a BJT? Maybe. But there's no guarantee he does. A degree from a certified university guarantees that you know the salient points of your field (not always true, but for my argument it is). If you don't have a degree, there's no guarantee. And this knowledge is important.
To say that a professor is the best person to learn from isn't 100% true. I had a linear algebra professor who simply lectured by reading straight from the book and then occasionally drew the diagrams. Alas, he apparently was a valuable research professor and had tenure, so there's little that could be done. In this case, the professor was hardly the best approach towards learning the subject.
And as for experience, well, in the case of a CS student wishing to enter industry there's a good chance that the majority of your professors never even worked in industry. So if you're looking for people with experience to learn from, well, then you're in quite an unfortunate situation.
This led me to conclude that a degree offers no such guarantee that someone knows something. It offers a guarantee that someone was introduced to a number of concepts and demonstrated an understanding (or knack for cheating, cramming, what have you) good enough to pass and move forward. This is why I shudder at the thought of hiring old classmates who had to be hand-held through their 4 (or more) years of university, I know better despite what their degree might say.
Which is why people who earned a degree... earned a degree; that's it. As far as I can tell, they have no right to call themselves engineers until they begin to practice engineering, and practice it well enough to demonstrate the value of their thinking.
The ideal professor is (a) a world-recognized expert in the appropriate field, and (b) a really good teacher. Frequently, you'll find one or the other, but not both. Far too frequently, you find neither.
But the best professors I've had (and I've had a bunch) were the ones that really did combine both. And frankly, I can forgive a lot of poor teaching in return for a "well, this is technically true, but no one really does it that way; they use this shortcut...."
As for a university education not providing immediately applicable industrial experience, well, that's kinda not the point of it. Sort of the difference between passing the FE exam and being a PE.
My coworker knows some calculus. I doubt he knows linear algebra. The forward-active voltage for a BJT? Maybe one other person I work with knows that, I doubt he does. I did not when I graduated, but then I am not a EE. My degrees are in Engineering Physics and Fluid Mechanics. Almost by accident I have become a flight test instrumentation engineer with the official job title of "Senior Electronics Engineer". I have tried hard to remedy my EE related shortcomings through self-study, and will continue to do so. I would never call myself an EE -- just an engineer.
I agree with you, for the purposes of your argument, that degree should serve as a guarantee. It is an important signifier of mastered domain knowledge, and more importantly, a signifier of the ability to master new domains.
There's more to it than that. In most places, 'Engineer' is a legally protected title. In Australia, signing off on a design as an engineer makes you legally responsible for guaranteeing that it has been correctly designed and is safe for public use. This includes personal liability in the case that it fails (bridge, software, whatever) and it can be shown that it was not designed according to appropriate standards, or to what should 'reasonably' have been done.
So basically, don't call yourself an engineer unless you're willing to sign off on something, and be legally bound by it. This implies a strong background in problem solving and structured design processes, to remove as much risk (both personal and to the public) as possible, which is also vital to engineering.
In the US we have the title PE, Professional Engineer, which requires some additional qualifications (exams, a supervised apprenticeship, etc.) and of course a state license. Most people don't bother any more, which has led to a much smaller number of PE-Engineers, many of whom are relegated to being mere license-holders who sign drawings for others who do the actual engineering.
>So basically, don't call yourself an engineer unless you're willing to sign off on something, and be legally bound by it.
I'm sorry to say it (not really); but for a number of reasons, some good and some bad, the title has been co-opted, and there is no going back.
>This implies a strong background in problem solving and structured design processes
As far as that goes, I've met a number of pedigreed folks who can't engineer their way out of a wet paper bag.
> Most people don't bother any more, which has led to a much smaller number of PE-Engineers, many of whom are relegated to being mere license-holders who sign drawings for others who do the actual engineering.
To some extent this happens in Australia as well, although there is a movement both to require things to be signed off by a PE (or CPEng here), and to have those engineers provide documented supervision of the work they sign off on.
> As far as that goes, I've met a number of pedigreed folks who can't engineer their way out of a wet paper bag.
No argument there, certification is never proof positive of competence. I've met very good engineers who aren't Engineers with a capital E, and very bad Engineers who knew enough to fool a test board, but not much more.
The existence of these licensing schemes is far from perfect, but better than nothing IMO. Applying the concept to general purpose software is another discussion entirely!
Being an engineer is about more than just technical skills, though. Engineers get it beaten into them that their first duty is to protect the public; their employer comes second. That means having the backbone to say "No" to unsafe demands, even if it costs them their job.
If he's as good, and as experienced as you say, he should be able to just do the legal/ethics stuff and get a license. Unfortunately, the professional associations are streamlined for people who take the usual path through university. At least in my jurisdiction, it is technically possible to have the experience counted, rather than the degree, but it's much harder.
The system is clearly not perfect, but when you ask the average engineer whether it's ok to do things like lie about their experience, you will get very different answers than if you asked the general public. The gatekeeper is doing a real job, even if they don't do it perfectly.
I have found real-world experience and self-study vastly underused on a lot of engineering projects. I'm not talking about programming, or CS. You guys can succeed without a degree. I'm talking about mechanical and electrical engineering projects. I have tried to decipher too many blueprints from licensed Electrical Engineers--whom I can guarantee didn't spend one day on a construction site. Getting your hands dirty counts for something. Know the theory--essential, but know how to put it together too. I have met mechanical engineers who can't work on their own automobiles--which I found baffling, because these guys didn't have a lot of extra money to spend--at least when I knew them. I do still kind of cringe when people throw around the title of engineer, though, if they don't have the math background and license to back it up.
In some countries it's illegal to call yourself an engineer without having an engineering degree. The degree is "protected" by the government and can only be acquired from certified organizations (like universities).
Can't stress this enough- in Canada (Ontario specifically) it is a serious fine for anyone caught using the term without being certified by a professional organization.
All provinces have a comparable professional organization, all of which are members of Engineers Canada. We have the notion of "self-regulated professions", where the government gives a charter to the professional organization to regulate use of their name, their members, etc. For Americans: it's not comparable to the IEEE, it's comparable to the College of Physicians and Surgeons (also a Canada/UK thing the US doesn't really have). So they're given the same power to penalize people "impersonating engineers" as the royal college can penalize people impersonating doctors.
This is a perfect example of why the term is protected by the PEO. I used to think the PEO was overreaching in its attempts to protect the term/designation, but I can see why they have taken issue with the term "Software Engineer" being genericized.
I applaud the effort to learn the topics of "statistics, probability, and linear algebra", but these would have been relatively fundamental courses in most software/computer/electrical engineering curricula that I've known about, and most definitely a prerequisite to calling oneself an engineer.
In some states, which regulate such things. I don't know about fines, but if a gang from the IEEE or the ASME catch you wandering around wearing their colors without being a member,....
>I think it's dishonest to call yourself a Software Engineer and not have an engineering degree.
Definitely semantics. For example, I have a BS/MS in applied math from a good engineering school. I am a software engineer mainly working with ECE, physics, and other math guys, who are all "software engineers".
I see where you're coming from about being a SW eng without knowing a lot of math - but there are other majors - math/physics/stats etc that will be very math heavy and not "engineering degrees".
Right, an engineering degree is a guarantee that you know the important points of what it means to be an engineer. Ethics, engineering process, math, etc. When you get an engineering degree from a certified university, you are an engineer. It's a guarantee that (at one point, at least), you possessed the set of skills and knowledge determined by a board of professionals to be necessary for a career of engineering. Other degrees may overlap, but they're not held to the same standards.
I think one thing that's understated is the amount of time we spend in ethics classes. I felt that there was always a semester where I was in some form of ethics or engineering history class.
>When you get an engineering degree from a certified university, you are an engineer.
That's not true. When you get a degree, you have an engineering degree. When you get hired and employed as an engineer, you are an engineer. You have an engineering degree and I have a mathematics degree. Our employers hire people for engineering positions and call them such. If I were a professor of math, I could call myself a professor. If I were a mathematician at the NSA, a Mathematician. But my employer calls me an Engineer. The degree does not do that.
Since we're getting pedantic: once you are employed straight out of uni, you are still not an engineer. You are a graduate, or cadet, engineer. Typically it's not until you've had several years of experience in an engineering capacity, and have passed your government's regulatory body's requirements, that you are actually an engineer. Usually this involves submitting a number of essays on your work and then passing an oral review board.
@dsuth - that depends on the country. Some employers in the US will take a new hire and call them an engineering intern if they require time and testing to reach PE status. That is not required, and other employers will grant the Engineer title immediately.
And I'm not merely referring to software. The above is true for my company's 20+ different engineering positions across all the disciplines they hire for (aerospace, mechanical, electrical, materials, software, etc.).
I obtained my degree in Software Engineering at a university where engineering grade mathematics and physics were mandatory subjects.
It really annoys me that I spent 4 years studying this to find that other people who scrape together snippets of JavaScript & PHP feel entitled to call themselves an "engineer".
A great many real, live engineers that I know would object to anyone calling themselves an engineer without a lot of study of the big three: statics, dynamics, and thermodynamics; not just general mathematics. It's one of the reasons I don't call myself an engineer. (But I do get to harsh on them a bit about not being professional programmers---their code isn't pretty.)
The other reason, of course, is that the "software engineer" term comes from a group of people who really wanted the respect that comes with "engineer" but realized that the big three don't get very far, software-wise. (And coincidentally didn't want to do all that icky math stuff. Not to mention much of the icky programming stuff.)
Ironically, EE had by far the most maths and maths-related courses of any degree at my uni. But we were still required to study those ones, along with the civil and chemical engineers.
Originally engineering meant something like applied science for maximal profit, or something like "the accountants of the science world", which requires little if any higher math.
A filter/weedout system was required because of too many students, so it's turned into something else entirely, and now an engineering degree often means nothing other than having passed the weedout math classes. It really shows in some new grads who don't have any actual engineering skills but are really good at calc problems.
Feel exactly the same way. Why not just call yourself a software developer? Because software engineer sounds better. Because people expect a better math foundation.
Lived and breathed math for 5 years in college before I could look myself in the mirror and call myself a software engineer.
I always thought that being an Engineer assumes you have knowledge of mathematics.
Not really. Mostly just algorithms and procedures from some math concepts (Calculus and Linear Algebra), which roughly corresponds to the first year of a North American math major.
My engineering physics curriculum required three semesters of Calculus, a differential equations course, and a numerical methods course. Notably lacking? A course in Linear Algebra, which was not required (and most in that program did not have time to take it, as the program was more like a double major with additional technical electives).
This could not be further from the truth. I doubt my CS professors would be able to solve a PDE (or algebraic geometry etc etc). Likewise, few math professors would be able to write a parser generator (or code worth a damn).
In contrast, I know engineers who live and breathe PDEs and tweak compilers to solve them faster.
PDEs are not discrete math. CompSci majors are more interested in topics like combinatorics, graph theory, etc. PDEs are more interesting to applied mathematicians and physicists.
I doubt my CS professors would be able to solve a PDE..
Beyond the basics, not even many math professors can do that. Math is too vast and people specialize. Strong algebraic geometers are not necessarily strong analysts or algebraists or logicians.
1) No such thing as universal mathematician in this day and age.
2) An engineer's PDEs (algorithms) are not the same as a mathematician's PDEs (theory). It's like comparing a student in China who learned English to communicate with English speakers to English majors from English-speaking countries.
It wasn't that long ago that most schools did not have a separate CS degree, but rather it was a math degree w/ a concentration in computer science (or some other verbiage to describe the same thing)
Depends on your school. UW CSE (a top 10 program) would be called rigorous but is not as theoretical (there is math, but nowhere near enough for a math minor).
I went and checked the undergrad curricula back at UMass Amherst where I went. I was incorrect: a generic Comp Sci program there stops three courses short of a math minor. The more theoretical concentrations - such as Theory of Computing, Machine Learning, or Programming Languages - teach enough theoretical content to come within one course of a math/stats minor, or even cross the line.
So yes, if you take a Computer Science major with a focus on Software Engineering, you will not learn enough math to minor in math "for free". If you pick a more mathematical subfield to focus on, you should probably declare a math minor for the one or two additional courses it will take you.
I made a truly stupid choice: I graduated in 7 semesters with a Comp Sci degree concentrated on PL theory without picking up the additional courses for a math minor. As a result, I'm "condemned" to learn that material independently later on. "Luckily", the Technion required me to do extra coursework for my MSc, so I've had to buck up and learn more theory.
Now get off my lawn until I'm done with my highly theoretical machine learning exam ;-)!
Minors are mostly meaningless on your degree, but the knowledge you get from the classes is worthwhile. I took some advanced math classes but kind of burned out on it towards the end of my degree (ironic since I went to grad school immediately after). What I did notice: CS classes are easy compared to advanced math classes... man, I struggled to eke out B's in those courses when CS classes were just easy A's.
If you want to do well at machine learning, electrical engineering is probably a better choice; the maths learned in EE overlap fairly well with what is needed to do ML. PL theory is quite niche, even for PL researchers.
I was actually being cautious. American colleges like to make sure you pay for as many BS courses as possible. Calculus 1, Calculus 2, Calculus 3...
Besides, I never heard of math majors taking cryptography that early. Usually, Intro to Real Analysis, Abstract Algebra and Abstract Linear Algebra come first.
About 2 years ago, I had an intense urge to learn linear algebra, statistics and probability in depth for much the same reasons as the author - to improve my machine learning and computer vision skills.
Really glad to see I'm not the only one.
I don't have any recommendations for linear algebra, but for stats and probability (which I always found intimidating in the past), Allen Downey's "Think Stats" and "Think Bayes" did the trick.
I'm enrolled in that class. In the introductory lecture Professor Klein shows a flowchart of the course†. This is the best course overview I've ever seen in any math class! I've spent /semesters/ in math sequences thinking "how the #@%! does this all fit together?!?!"
He says: "Don't try to read it all. It's a map... It's there to help you keep track of where you are and where we're going." Every professor should do that for their course. And every department should put up a big poster with something similar: these are the subject areas you will study, how they relate to each other, and the courses that cover those areas; if you choose this specialization these are the areas you'll focus on. Put-out a mind-map of the subject area that relates to the available courses—help students start building Elon Musks' mental-hyperloop / semantic-tree.
Well written; this resonated with me. My day job is software engineering. At night I have started to build a speech synthesizer. I had some math knowledge before, but I need to constantly improve it (in the territory of signal processing and everything machine-learning related).
A related "Ask HN" from a couple of months ago: How or where to begin learning mathematics from first principles? [1]
I'm at the very start of what I hope might be a similar journey, and have signed-up for a Coursera "Introduction to Mathematical Thinking" course. I'm hoping it might give me some insight to build on. The course starts in about ten days, so apprehension hasn't kicked-in yet.
When I pick up a book and start to learn something new (which is pretty much what I do in my free time), I usually do it with an IDE at hand. I try to implement as much of what I learn in code. If I can teach the computer how to do it, then I am sure I really understand it, and it often helps me find holes in my understanding.
I have significant trouble with doing this with mathematics, though. Programming languages (at least the ones I know) just don't seem to be good for expressing things like identities and invariants. Anyone have any suggestions on how to handle such things?
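The closest I've gotten is property-based testing, e.g. with Python's Hypothesis library: you state an identity as a property and the machine hunts for counterexamples. It's evidence rather than proof, but it does catch holes in understanding. A sketch:

    from hypothesis import given
    import hypothesis.strategies as st

    # State the identity (a + b)^2 = a^2 + 2ab + b^2 as a property;
    # Hypothesis searches for integer counterexamples.
    @given(st.integers(), st.integers())
    def test_binomial_square(a, b):
        assert (a + b) ** 2 == a ** 2 + 2 * a * b + b ** 2

    test_binomial_square()  # a @given-wrapped function runs the search when called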
I own this book, and I'll tell you it's not easy to learn math from it. Most of it is just a very light overview. There are much better rigorous textbooks that are simpler and more complete.
There are some great textbooks translated from Russian. Analysis by Kolmogorov, (rigorous) Linear Algebra by Shilov, Complex Analysis by Markushevich to name a few.
The book covers too much to be thorough. Each chapter gives a good introduction to its subject matter and ends with a list of suggested reading.
I always read the relevant parts of this book before going deeper. Not everyone is going to delve into non-Euclidean geometry, functional analysis, and topology.
Furthermore, I don't think the typical self-studying engineer on Hacker News wants to learn math through a rigorous introduction to analysis. You can get good working knowledge and intuition without knowing what delta-epsilon is.
I agree with you; that book is great for giving an overview of the general areas of mathematics and for providing context before going deeper into an area. I've used it to get some background on the courses I'm taking before the semester starts and have found that really helpful.
This is a good book, although I agree with others that truly learning from it would be difficult. If you like this, another book you might really enjoy is the Princeton Companion to Mathematics.
There are some numbers for this in David Bellos "Is that a fish in your ear?"[1], chapter 19.
A UNESCO study of translations between Swedish, Chinese, Hindi, Arabic, French, German, and English over a decade showed that 104,000 of the 132,000 translations made between all those languages were translations from English.
>Yeah, no. Usually good books get translated, period.
Great books certainly get translated in every direction. But for merely good books, I wouldn't be surprised if readers outside the US consumed more books translated from English than readers in the US consume books translated from other languages.
I don't think so, for the simple reason that academic books in English usually aren't translated, because people in countries outside the US can read English. Even academic books without any native English-speaking authors are usually written in English. Books are more likely to be translated to English than from English, because translating into English multiplies the size of the audience many times, whereas the other way around does not.
Considering that the average reader in Europe reads more than the average reader in the US, and that many of these countries have a very dynamic domestic publishing industry, I would say the opposite.
And I didn't even take India and China into account...
If you don't mind, could you list some more books that you think are particularly great? The topic is not so important to me; I'm interested in books that are the best in their field. Much appreciated!
I'm taking Discrete Math now and have both books. Epp's has much simpler explanations. I haven't taken algebra in 5 years and find that Rosen's assumes a lot more knowledge, frequently skipping steps in its explanations, which can be very confusing if you are rusty in the basics.
I think maths is amazing. It's a useful tool to have in one's arsenal regardless of your profession or walk of life. It's also not the sole domain of a highly educated elite. Anyone can do it.
It's not important to be "correct" all of the time so much as it is to be curious and willing to learn -- and willing to share.
Keep at it! There are so many cool things you can do as you learn more!
"On Numbers and Games" is a great book but it can be tough. "Winning Ways for your Mathematical Plays" by Berlekamp, Conway & Guy is a gentler introduction to surreal number. It's also very fun.
Barbara Oakley, mentioned in the text, teaches the course "Learning How to Learn" on Coursera [1]. It is about the science of learning in general, not just learning math, and it is very accessible. It's only 4 weeks, and the lectures are easy to understand. I knew most of the techniques before (spaced repetition, recall, proper sleep, etc.), but it was still a good review.
I lost interest in mathematics starting in tenth grade. Looking back into the haze, it wasn't so much mathematics as high school in general, and by extension industrialized education. Boredom begat indifference. Life's chaos reoriented my priorities, and my engagement with mathematics in post-secondary education came in fits and starts.
I enrolled in Coursera's Discrete Optimization about a year ago. It was way beyond my ability [No SoA], but I learned a hell of a lot about computing and solving hard computational problems. It also suggested that mathematics might be important when grappling with interesting problems.
For fun I watched a number of Strang's OpenCourseWare videos on matrices. I hit Chapter 1 of TAoCP. I treated it like programming: it was OK to only partially understand. I took an Algorithms MOOC or two. I actually enjoyed following the analysis, even though formal analysis is not something I would do for pleasure.
The turning point, though, was when Lamport's "Thinking for Programmers" [Lamport: Thinking] hit HN. It threw down the gauntlet. Specification is a prerequisite for a commitment to getting it right, where the "it" is computing. And when the "it" is computing, we are talking about math whether we like it or not.
It's not that math and specification replace testing. It's that tests are ad hoc without a mathematical understanding of the computation.
It's not a misuse of vocabulary, it's a misspelling. As my typing speed trends toward my thinking speed, the mechanical action becomes more tied to phonemes. I've found myself increasingly typing "are" for "our" which is fairly classic. "One" for "won" not so much.
There's a lot of title inflation in the US, where someone who only knows some HTML, CSS, and a bit of JavaScript, with little formal engineering education of any kind, can be called a software engineer.
Many places, including SV, have companies that give that title even to those who haven't studied CS or math formally.
But besides that, I see a deeper motivation in the article. The author says
"My dream is to learn the statistics, probability, and linear algebra needed to really understand machine learning and computer vision...I need a solid foundation so that I can truly understand what's going on: why something works, when it won’t work, and what to do differently if it doesn’t."
I contend that even many who have formally studied CS and math probably don't truly understand these math tools, that is, if they are using them in the first place. Intuition in math takes time to build up, and requires considerable mental effort.
I'll go off on a different tangent from the other replies: there's a whole lot more to math than two years' worth.
By analogy, a business-school IT degree is two years of CS plus a bunch of biz classes instead of compilers and automata theory (it varies; huge simplification; etc.). However, it's possible to study CS at higher levels for immensely longer than the two years an IT grad will get.
In real life, a software engineer can finish his studies and get a job with only a tiny knowledge of math, in every country, regardless of what the curriculum says...
Well, it's a bit of an abuse of the term "engineer". It's really sad to see how software people first abducted the word and then degraded it nearly to obliteration.
I've been developing software since 1986 and never needed such dummy tests to get hired.
Companies choose employees, but employees can also choose companies: ones that can recognize the value someone brings beyond a few programming exercises done on a sheet of paper in a one-hour interview.
As for FizzBuzz, I've never bothered coding it. I see no value in it.
> I've been developing software since 1986 and never needed such dummy tests to get hired.
I've never personally had a FizzBuzz test, but I also have no doubts about my ability to solve it in my sleep.
I'm against trick interviews and algorithm quizzes, but FizzBuzz just establishes a ridiculously low baseline: that you know the most essential aspects of programming. Someone a few weeks into CS 101 should be able to do it, so any professional developer who can't deserves to be immediately laughed out of the room.
FizzBuzz is only a "dummy test" in the sense that only dummies will fail it.
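For anyone who hasn't seen it, the entire exercise fits in a few lines. One common Python rendering (details like the range of 1 to 100 are conventions, not a spec):

    # FizzBuzz: multiples of 3 print "Fizz", multiples of 5 print "Buzz",
    # multiples of both print "FizzBuzz", everything else prints itself.
    for n in range(1, 101):
        if n % 15 == 0:
            print("FizzBuzz")
        elif n % 3 == 0:
            print("Fizz")
        elif n % 5 == 0:
            print("Buzz")
        else:
            print(n)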
I am doing this too, with Macdonald's "Linear and Geometric Algebra" and "Vector and Geometric Calculus" along with "Networks, Crowds, and Markets" and "Chaos and Nonlinear Dynamics." It's slow going, but very enjoyable.
The most interesting side effect I've noticed from doing this is a dramatic improvement in my ability to focus. I had been suffering from a general scatterbrained, distracted feeling for a while before I started. I think it was due to the way I consume content online, trying to follow too many interests at once. I set aside about an hour a night to work on these math courses and within a couple of weeks I noticed that my focus was greatly improved. For that reason alone I'm counting this project as a major win, regardless of whether I master these topics or not. Having something to study seems to be extremely valuable for general mental health, at least for me.
Given everyone recommending books for math autodidacts interested in learning the "breadth" of mathematics, I cannot recommend highly enough Saunders Mac Lane's "Mathematics: Form and Function".
It reads as a high-level false-historical account of how math might have been developed, in its entirety, if it had unfolded in the most direct and logical way from the human experiences we all share (guided by an understanding of where mathematics has ended up in the modern age).
The false-historical exposé is fantastic. The goal is less to express things as accurately and comprehensively as modernly possible (which you'll experience as you take on graduate-level books or modern papers) and more to demonstrate large-scope intuition for why whole fields evolved as they did and how they are all intermingled.
Going back to the "breadth" recommendations, if someone wants a real appetizing sushi sampler of the breadth of Mathematics, I would highly recommend "The Joy of X" http://www.amazon.com/Joy-Guided-Tour-Math-Infinity/dp/05441...
It covers a number of topics in mathematics and makes them really easy to follow for someone with no background in maths. I especially liked how, in just a few pages, it gave me a succinct understanding of div, grad, and curl from vector calculus.
For anyone wanting to start out on a full course meal of Mathematics this would make a very appetizing sampler.
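For reference, here are those three operators in standard notation (textbook definitions, not from the book under discussion), for a scalar field f and a vector field F = (F_x, F_y, F_z):

    \operatorname{grad} f = \nabla f
      = \Bigl(\frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y},\; \frac{\partial f}{\partial z}\Bigr)

    \operatorname{div} \mathbf{F} = \nabla \cdot \mathbf{F}
      = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z}

    \operatorname{curl} \mathbf{F} = \nabla \times \mathbf{F}
      = \Bigl(\frac{\partial F_z}{\partial y} - \frac{\partial F_y}{\partial z},\;
              \frac{\partial F_x}{\partial z} - \frac{\partial F_z}{\partial x},\;
              \frac{\partial F_y}{\partial x} - \frac{\partial F_x}{\partial y}\Bigr)

Grad points in the direction of steepest increase, div measures net outflow at a point, and curl measures local rotation.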
As a software developer who also started learning mathematics for the same purpose (machine learning), I just could not do it with books alone. It is strange to explain, but I need someone to study with.
I need that person to say: "I don't understand it". Even if I don't understand it myself, as I try explaining it to my study partner, it starts clicking in my brain. I suddenly start to understand these parts as I'm explaining it. Hands down the best way for me to learn: explain to others.
I took Andrew Ng's Machine Learning course on Coursera together with a friend. I had the software engineering skills; he had the math skills. Together we worked through all the exercises with a lot of discussion, helping each other, and of course we had some disagreements, which we learned even more from.
For information theory, I really recommend Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory". It starts from the basics, with direct examples, straightforward definitions, and readable notation.
It's interesting that someone who studied Software Engineering didn't get a solid math education at university.
I studied Software Engineering, and am a certified engineer. I sat next to the aerospace, chemical, and mechanical engineers for all 4 years of "Engineering Math". (They did crazy physics and chem that I didn't.)
How can you call yourself a "Software Engineer" without having studied a huge amount of math?
Great article. I've recently been teaching myself machine learning the same way. I found that staying on track started to get difficult in the middle of the course.
I decided to "scratch an itch", in open-source parlance, and start a study group I call colearning. Colearning provides a place for people doing remote education (or working on their own projects or writing) to have structured time to study and move toward their goals. It provides the positive pressure that a real physical class gives.
Our colearning group currently meets at Borderlands Books in San Francisco on Sundays from 11 am to 4 pm. Message me and you can come join!
I've done this to provide the structure I need to keep learning; it's already helped me advance greatly in my own studies, as I make forward progress every week.
Years ago I also identified a need for community while being self employed and started the coworking movement.
I don't mean to go off track here, but as someone who's been casually interested in math, I'm wondering which branches of math are the most applicable to software development. I've taken Discrete Mathematics, Stats, and Calc in college, but now I've been working in the field for a few years and want to try learning a bit more. I've been thinking of trying some courses on Cryptography, but it would be a major shift from my current background (web dev).
I feel like there are so many different directions to take, but which ones are the most applicable to practical software development? Particularly in the realm of web development?
Editing my question to ask which math subjects are most applicable to the _PRACTICAL_ side of software development; hopefully that clears it up a bit.
I feel like that might be the wrong question. If you wanted to really study formal computer science, you could learn things like model theory. If you want to study the analysis of algorithm run times, you may want to study something like complex asymptotics. Do you want to study math for computer science itself, or math that helps in most applications of computer science (graph theory, linear algebra, etc.)?
Thanks for this. Edited the question a little to hopefully clarify things. I'm mostly going back to Coursera out of pure curiosity, but was hoping I could take something that has some relevance to my career in web development (the practical side, so more of your latter statement).
Sounds like Linear Algebra would be good to learn.
Linear Algebra is definitely one you can make use of. When I studied it at uni I did it from a mathematical standpoint and only some years later discovered how useful it is in computing.
Have you done any study of Graph Theory? I've found that a surprisingly large number of problems in computing can be morphed into graphs. I was working on an issue last week (in a web app) that turned out to be a variant of Exact Cover which straight away gave me a large body of literature / algorithms to pull from - and told me I was in a slightly dark place :). Graphs and the algorithms around them feel very close to the work we do with computers, so it will probably be pretty approachable given your background.
My advice would be to not worry too much about how applicable it will be to web development and just pick something to study. Ultimately, it will end up being useful in time. And maths is just beautiful.
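To make the Exact Cover aside above concrete: given a universe of elements and a family of subsets, the task is to pick subsets that cover every element exactly once. A toy brute-force sketch in Python (my own illustrative example, not the commenter's web-app case; real solvers use Knuth's Algorithm X):

    # Brute-force exact cover: choose named sets covering each element once.
    def exact_cover(universe, sets, chosen=()):
        if not universe:
            return chosen  # everything covered exactly once
        # Only branch on sets that fit entirely inside what remains,
        # which guarantees no element is ever covered twice.
        for name, s in sets.items():
            if s <= universe:
                result = exact_cover(universe - s, sets, chosen + (name,))
                if result is not None:
                    return result
        return None  # dead end; caller backtracks

    sets = {"A": {1, 2}, "B": {3, 4}, "C": {2, 3}, "D": {4}}
    print(exact_cover({1, 2, 3, 4}, sets))  # ('A', 'B')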
Probability, convex and nonlinear optimization, linear algebra, and statistics are clear wins in many problem spaces, especially machine learning, AI, and computer vision. Ordinary and partial differential equations and numerical methods are heavily used in process modeling and HPC for problems in science and engineering. Geometry, curvature, and matrix theory are heavily used in computer graphics to model shape and movement, as well as computer vision. Number and group theory are the basis of crypto, I believe. In general, calculus and its variations are used to represent and describe solution techniques theoretically before mapping them to discrete solutions using numerical methods.
Computer science is pretty broad. What do you want to do? Just because I use linear algebra all the time for machine learning doesn't mean it will help you (that much) with computational complexity analysis.
I am currently taking a great Coursera course that is meant as a mathematics refresher. It's called "Mathematical Methods for Quantitative Finance" [1]. As the name implies, it's meant as a foundation for financial calculations like options pricing, but a lot of it is just pure maths: limits, differentiation, and integration.
The lectures are quite clear and easy to follow. I did all this math at university, but that was over 20 years ago and I had forgotten a lot, so for me this course is perfect. But I think it can be quite useful even if the concepts are new to you.
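In the spirit of checking math with code, here is a small sketch of numerically sanity-checking a derivative and an integral in Python; the specific functions are arbitrary examples of mine, not from the course:

    import math

    # Central difference: d/dx sin(x) should be cos(x).
    h = 1e-6
    x = 0.7
    print((math.sin(x + h) - math.sin(x - h)) / (2 * h))  # ~0.7648
    print(math.cos(x))                                    # ~0.7648

    # Trapezoidal rule: integral of cos over [0, pi/2] is sin(pi/2) = 1.
    n = 10_000
    a, b = 0.0, math.pi / 2
    interior = sum(math.cos(a + (b - a) * i / n) for i in range(1, n))
    integral = (b - a) / n * ((math.cos(a) + math.cos(b)) / 2 + interior)
    print(integral)  # ~1.0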
I've been doing Project Euler problems, and I realize it really, really helps to know number theory to get to an optimal solution. Can anyone recommend good books on the subject? While learning via problem-solving is fun, most of the time it boils down to me sitting with a naive, brute-force solution that's too slow, then googling how to make it fast, discovering a new number-theory theorem or tool, and then changing the algorithm. There's gotta be a better way to learn...
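A representative example of that pattern (a standard technique, not a solution to any particular problem): many Euler problems that are hopeless with per-number trial division fall quickly to the sieve of Eratosthenes.

    # Sieve of Eratosthenes: all primes below n in O(n log log n),
    # versus trial-dividing every candidate separately.
    def primes_below(n):
        if n < 2:
            return []
        is_prime = [True] * n
        is_prime[0] = is_prime[1] = False
        for p in range(2, int(n ** 0.5) + 1):
            if is_prime[p]:
                # Cross off every multiple of p, starting at p*p.
                is_prime[p * p::p] = [False] * len(range(p * p, n, p))
        return [i for i, flag in enumerate(is_prime) if flag]

    print(primes_below(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]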
The comments are amusing. There seems to be a popular opinion that being an engineer requires a higher understanding of mathematics than most mere mortals have.
My understanding was that to call yourself an engineer, you must be capable of building robust and efficient things; whether that thing is a plane, a bridge, complex software, or even a website matters not, so long as it is robust and efficient.
The degree to which you understand mathematics is useful only insofar as it enables you to build something robust and efficient. Thus, if you know more linear algebra than 90% of all mere mortals but write inefficient programs, then at best you are a bad software engineer; at worst you're simply not a software engineer.
For that reason, at my current job I was asked to replace a man who had earned his PhD in Computer Science from Georgia Tech, whose job was to build and maintain an API in Go, despite the fact that I'm a college dropout. Smart man, with a far better understanding of mathematics, but ultimately I could build things more robustly and efficiently. I had stronger engineering skills.
Going back to OP: right on, man. I personally didn't enjoy math until I learned it using my own rules. By my own rules, I mean I shunned the established method of repeating a process until you memorize it, instead opting to understand the underlying concept and keep my thoughts written down somewhere for future reference, knowing all too well by now that memorization is futile given how frequently I have to reference Stack Overflow.
As an electrical engineer, I like to quip that 99% of the work I do requires only V=IR, or slight variations / derivations thereof.
Of course, it's that remaining 1% that will fuck you if you don't have the maths chops. And this is as a consulting engineer, which arguably uses the least maths. Once you get into actual design or analysis, that number becomes a sliding scale in the other direction very quickly.
> There seems to be a popular opinion that being an engineer requires a higher understanding of mathematics than most mere mortals have.
> build and maintain an API in Go
Unfortunately, a much larger proportion of engineering jobs require mathematics than you make it seem. Software engineers seem to forget that engineering includes designing planes, bridges, cars, lasers, electrical circuits, materials, and a million other things that require you to know mechanics, statics, statistics, logic, linear algebra, differential equations, etc.
I clearly brought up designing planes and bridges; I figured both were enough to demonstrate that my definition of engineering includes the structural, the mechanical, the electrical, etc.
The point is, the engineer who designs the plane only needs to know enough math to design said plane. The engineer who designs the bridge? The same. Theoretically, they're supposed to understand a great deal. Realistically, the degree to which technology automates the mundane task of calculation could very well mean that their daily application and actual understanding of mathematics is overestimated anywhere from slightly to greatly. This will be especially true as time goes on and the tools of the trade become even more advanced.
To design a web application that operates efficiently in terms of cost, response time, and required maintenance, along with a number of other variables, requires good software engineering. Once upon a time, this required strong mathematical aptitude. Now it requires more of an understanding of the language used to build the application, an ability to justify why one approach should be faster through algorithm analysis and efficient testing, logical thinking, and so on. It certainly doesn't require the mathematical aptitude to design your own cryptography protocol when you can simply use a library demonstrated to be safe and reliable, although an understanding of how said protocol works underneath might be nice, if only to understand why you shouldn't roll your own.
It's not too far off to imagine a near future where the average <insert engineer here> functionally requires the same degree of mathematical aptitude as today's current web developer.
Right. And software developers usually don't realize that engineering is all about design and implementation tradeoffs that fit within quantitative tolerances and deliver a specific level of accuracy, precision, and reliability. Software 'engineering' is nowhere near mature enough to achieve this, and all too often it is unaware of that.
Do you have issues remembering what you learn? Sure, we've all studied these topics before in school but have forgotten them since. I brush up on some geometry or linear algebra every now and then for work. But I'm sure I'll forget it if I don't use it for a year or two.
Are there any projects you maintain to keep your knowledge fresh?
I hated math from elementary school up until I learned Fourier theory. That was a pivotal moment in my education; math went from rote exercise to pure beauty in a single lecture. Now I'm trying to self-learn Lie algebra, and I need to find out what prerequisites I need. A ton of math at this level is notation and terminology.
Same here. Linear algebra is what got me from "math is boring, and I only learn things to pass a test" to "math is really cool and can actually solve problems in real life".
Has anyone had any experience learning different mathematical techniques using Mathematica? I was wondering if it might be more enjoyable to learn some areas without the manual arithmetic usually involved in school. Any recommendations on books or courses appreciated, thanks!
There's a pretty good Linear Algebra course currently being offered on edx.org. I just made it past week 1 with relative ease and in the next few weeks we will be using Mathematica. (Every student gets a license for the duration of the class)
Originally a die-hard math major, and now, hacking and dealing with hackers, I have had to question the foundations of my own subjects. I have come to some tentative conclusions:
Computer science is computer programming with all context abstracted away, so that it is just a bunch of symbols.
Statistics is mathematics applied to "real world" data, and the art of turning that data into a form suitable for a computer.
Actually, in Sun Tzu's "The Art of War", he puts it this way:
In respect of military method,
firstly, Measurement;
secondly, Estimation of quantity;
thirdly, Calculation;
fourthly, Balancing of chances;
fifthly, Victory.
I didn't really learn math until I had to use it in my work.
For example, my thesis work used a lot of complex analysis and wave equations. I had covered those in physics and math classes, but I really didn't obtain full understanding until I used them.
The article title could be "Why and How to Learn On Your Own" and the points it makes are absolutely correct. Constantly find new things to explore and learn, and never be intimidated by just how daunting those things may seem.
I have been a software engineer for the last 10 years, I am interested in a PhD in statistics/math, and I live in Silicon Valley. What is the starting point for any courses or a degree? Any recommendation would help.
> Why not quit my job and go back to school? Well, that's not really for me. Reading books at my own pace lets me try a subject out without fully committing to it and making it a necessity that I find work based off of it.
I respond here in three parts, the first general, the second more specific and about linear algebra, and the third about statistics.
Really, in school, you still have to learn the stuff. A class might help, but it won't be enough; during the course, and in the hours outside the class, you still have to study and learn the stuff. As they say, "mathematics is not a spectator sport": you still have to study the material, get it between your ears, and understand it.

Eventually you can conclude that college and its courses are mostly for certification rather than education. The education is heavily up to you.
But there are some dangers in self-learning: not all ideas and learning materials are good; the really good ones are only a small fraction of what you will likely encounter.

So, in praise of college and profs: they can (1) get you on a good track with good ideas and learning materials and (2) get you unstuck and keep you on track. In such study it's possible to do too much or too little; from some experts you can gauge roughly the right amount.
I said "college"; revise that to read "one
of the world's best research
universities", say, in the US, 1-2 dozen.
The ideas and materials you will see in
such a university really do stand to be
better than nearly all you will encounter
otherwise. Such learning is where there's
not much substitute for quality. But,
still, just to learn the material, you
don't really have to enroll in such a
university and, instead, just borrow from
their course descriptions and materials.
Indeed, some of the best such universities
are working hard to make their learning
materials available to all for free over
the Internet. Why? Because those
universities want to concentrate on
pushing forward with research.
Here's a way: show up at such a university and the appropriate department -- in your case applied math, mathematical sciences, operations research, statistics, whatever. Maybe show up at some public department seminars. Talk with some of the students. Say you have a career going, that for your career you are interested in what the department is doing, and that you want to learn more about the program and the mathematical content. So, get some of the students to talk and explain.

Then, for some of the courses you are interested in, see who the profs are and look at the course materials -- texts, handouts, on-line files, etc.

Then, after looking at the materials and, say, making some progress with them, try to get 15 minutes to chat with a prof.

Then take what you just got from that department for free -- broad directions, what they regard as more/less important, texts, course materials, etc. -- and go off and study on your own. When you think you have some course studied well, try to get a copy of the course final exam or Ph.D. qualifying exam and work through it; if it appears you did well, ask a prof to check your solutions to a few of the most difficult questions. If your solutions look good, then you will start to look good and may get asked whether you would like to apply as a student in the department. So here, you and the prof and department are interviewing each other.
Such things worked for me: (A) In my career I kept running into the work of John Tukey at Princeton and Bell Labs -- stepwise regression, exploratory data analysis, power spectral estimation, convergence and uniformity in topology, his statement equivalent to the axiom of choice, etc. So I wrote him at Princeton asking about graduate study, mentioned those topics, and got back a nice letter from the department Chair basically inviting me to apply.

(B) I applied to Cornell and got rejected. But, largely independently, I visited a prof there to discuss optimization, asked about being a grad student, and soon got another letter accepting me to grad study.
(C) At least at one time, the Web site of the Princeton math department said, "Graduate courses are introductions to research by experts in their fields. No courses are given for preparation for the qualifying exams. Students are expected to prepare for the qualifying exams on their own." or some such.

Lesson: even in grad school at Princeton, you're still expected to learn the qualifying exam material on your own. Well, to do that, you don't have to be in the high-rent area around Princeton, NJ.
When I did go to grad school and got my Ph.D., what really saved my tail feathers was what I had done, and did, on my own as independent study. Some of the courses helped by providing high-quality directions and materials, but the real work was nearly all independent.

One of the crucial inflection points was when I took a problem posed in a course but not solved in the course, did some research, and found a solution. The solution was novel, and word spread around the department quickly. My halo got a high polish, and that greatly eased my path through my Ph.D. That is, I had proven results on the most important academic and Ph.D. bottom line -- I'd done good, novel, "new, correct, significant" research. Then my haircut or lack thereof, sloppy handwriting, occasional upchuck at some bad course material, etc. no longer mattered.

My research looked publishable, and it was -- I did publish it later, in one of the best journals, easily, no revisions. So, lesson: that inflection point came from independent work.
So, don't feel that your independent work is inferior to enrolling as a student. In the best universities, as a student, nearly all the work is for you to do independently anyway.

Still, I'd repeat: try to pick the brains, for free, without hurting your present career, of the courses, profs, and materials at a world-class department in a world-class research university. Why a research university? World-class research is a very high bar and some of the best evidence of good expertise, insight, and judgment in the field -- and you don't want the opposites.
After freshman calculus, it would be good to start with abstract algebra. There you will get handy with relatively simple versions of theorems, proofs, sets, axioms, and groups -- a good place to start, and one that, I have to say, would put you ahead of a surprisingly large fraction of the best chaired professors of computer science. (Once I published a paper in multivariate, distribution-free statistical hypothesis tests where the core of the math used some group theory.) It also covers rings, fields, the integers, rationals, reals, and complex numbers and their leading properties; some important algorithms, e.g., the Euclidean greatest common divisor (also the way to find multiplicative inverses in the finite field of the integers modulo a prime number and, thus, the core of a super cute way to do numerical matrix inversion exactly using only short-precision arithmetic); number theory and prime numbers (crucial in cryptography); vector spaces (the core of multivariate statistics and more); and some of the classic results. Some of this material is finite mathematics, at times of high interest in computing -- e.g., error-correcting coding. For such a course, a good teaching math department would be good. Have a good prof read and correct your early efforts at writing proofs; it could help you a lot.
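To make the GCD aside concrete, here is a minimal Python sketch of the extended Euclidean algorithm and the modular inverse it yields; this illustration is mine, not part of the comment above:

    # Extended Euclid: returns (g, x, y) with g = gcd(a, b) and a*x + b*y == g.
    def extended_gcd(a, b):
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def mod_inverse(a, p):
        # Multiplicative inverse of a in the integers modulo a prime p.
        g, x, _ = extended_gcd(a % p, p)
        if g != 1:
            raise ValueError("a and p are not coprime")
        return x % p

    print(mod_inverse(3, 7))  # 5, since (3 * 5) % 7 == 1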
But starting with linear algebra is also good. There are lots of good books. The grand classic, best as a second text, is P. Halmos, Finite Dimensional Vector Spaces. He wrote it in 1942 after he had gotten his Ph.D. under J. Doob (long the best guy in the US in stochastic processes) and was an assistant to von Neumann at the Institute for Advanced Study (von Neumann being, for much of the 20th century, a good candidate for the best mathematician in the world). The book is really a finite-dimensional introduction to von Neumann's Hilbert space theory.
Von Neumann is the guy on the right in a well-known Los Alamos photo. The guy on the left is S. Ulam (who has a cute result, one the French mathematician LeCam once called tightness, that I used once). The guy in the middle is just a physicist! Of course, in that picture they were working up ways to save 1 million US casualties in the Pacific in WWII, and they were astoundingly successful. Ulam is best known for the Teller-Ulam configuration, which in its first test yielded an energy of 15 million tons of TNT. There are rumors that von Neumann worked out the geometry for the US W-88: 475 kilotons in a small package.

Von Neumann also has a really nice book on quantum mechanics, the first half of which is a totally sweetheart introduction to linear algebra. And of course Ulam was an early user of Monte Carlo simulation, still important.
Other linear algebra authors include G. Strang, E. Nering, Hoffman and Kunze, R. Bellman, B. Noble, and R. Horn. For numerical linear algebra there are, e.g., G. Forsythe and C. Moler, the LINPACK materials, etc. There are free on-line PDF versions of some of these. Since the subject has not changed much since Halmos in 1942, you don't necessarily need the latest paper copy at $100+!
For statistics, that is a messy field. It has too many introductory texts that oversimplify the subject and not enough well-done intermediate or advanced texts.

Also, the subject carries essentially a lie. The texts explain that a random variable has a distribution. Right, it does. Then they mention some common distributions, especially Gaussian, exponential, Poisson, multinomial, and uniform. Then the lie: the suggestion is that in practice we collect data and try to find the distribution. Nope, mostly not. Mostly in practice we can't find the distribution, not even of one random variable, and much less likely the joint distribution of several random variables (that is, of a vector-valued random variable). Estimating the distribution of a vector-valued random variable commonly runs into the curse of dimensionality and requires really big "big data". Instead, usually we use limit theorems, techniques that don't need the distribution, or in some cases make, say, a Gaussian assumption and get a first-cut approximation.

Early in my career I did a lot in applied statistics but later concluded I'd done a lot of slogging through a muddy swamp of low-grade material.
A clean and powerful first-cut approach to statistics is just via a good background in probability. With this approach, for statistics, you take some data, regard it as values of some random variables with some useful properties, stuff the data into some computations, and get out data that you regard as the values of some more random variables -- the statistics. The big deal is what properties the output random variables have: maybe they are unbiased, minimum variance, Gaussian, maximum likelihood, estimates of something, etc.
For this work you will want to know the classic limit theorems of probability theory -- the weak and strong laws of large numbers, the elementary and advanced (Lindeberg-Feller) versions of the central limit theorem, the law of the iterated logarithm (and its astounding application to an envelope of Brownian motion), and martingales and the martingale convergence theorem ("the most powerful limit theorem in mathematics" -- it's possible to make applications of that result much of a successful academic career). And, generally beyond the elementary statistics books, you will want to understand sufficient statistics (and the astounding fact that, for the Gaussian, sample mean and variance are sufficient, with generalizations to the exponential family) and also U-statistics, where the order of the input data makes no difference (and order statistics are always sufficient). Sufficiency is really from (a classic paper by Halmos and Savage and) the Radon-Nikodym theorem (with a famous, very clever, cute proof by von Neumann), and that result is in, say, the first half of W. Rudin, Real and Complex Analysis (with von Neumann's proof).

Also, with the Radon-Nikodym theorem you can quickly do the Hahn decomposition and then knock off a very general proof of the Neyman-Pearson result in statistics. How 'bout that!
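As a small taste of those limit theorems in code, a toy central limit theorem check with numpy (my own sketch; the Uniform(0,1) choice is arbitrary):

    import numpy as np

    # Means of uniform samples look Gaussian as the sample size grows.
    rng = np.random.default_rng(0)
    n, trials = 50, 100_000
    means = rng.uniform(0, 1, size=(trials, n)).mean(axis=1)

    # Uniform(0,1) has mean 1/2 and variance 1/12, so the sample mean
    # has standard deviation sqrt(1/(12*n)).
    print(means.mean())            # ~0.5
    print(means.std())             # ~0.0408
    print((1 / (12 * n)) ** 0.5)   # ~0.0408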
Thus, to do well with statistics, both now and in the future, especially if you want to do original work, you will need much of the rest of a good ugrad major in math and the courses of a Master's in selected topics in pure/applied math.

For such study: at one time Harvard's famous Math 55 used the Halmos text above along with W. Rudin, Principles of Mathematical Analysis (calculus done very carefully and a good foundation for more), and Spivak, Calculus on Manifolds, e.g., for people interested in more modern approaches to relativity theory (though Cartan's book is available in English now). It may be that you are not interested in relativity theory or the rest of mathematical physics -- fine, and that can help you set aside some topics.
Then Royden, Real Analysis, and the first half of Rudin's R&CA as above, along with any of several alternatives, cover measure theory and the beginnings of functional analysis. Measure theory does calculus again in a more powerful way: in freshman calculus you want to integrate a continuous function defined on a closed interval of finite length, but in measure theory you get much more generality.

Measure theory also provides the axiomatic foundation for modern probability theory and random variables. Seeing that definition of a random variable is a real eye-opener -- for me a life-changing event: you get a level of understanding of randomness that cuts out, and tosses into the dumpster or bit bucket, nearly all the elementary and popular (horribly confused) treatments of randomness.
Functional analysis? Well, in linear algebra you get comfortable with vector spaces. For positive integer n and the set of real numbers R, you get happy in the n-dimensional vector space R^n. But also be sure to see the axioms of a vector space, where R^n is just the leading example. You want the axioms right away for, say, the (affine) vector subspace of R^n that is the set of all solutions of a system of linear equations. How 'bout that!
Then in functional analysis you work with functions, where each function is regarded as a point in a vector space. The nicest such vector space is Hilbert space, which has an inner product (essentially the same as angle -- in probability, covariance; in statistics, correlation) and gives a metric in which the space is complete; that is, as in the real numbers but not in the rationals, a sequence that appears to converge really has something to converge to. Then, wonder of wonders (really, mostly due just to the Minkowski inequality), the set of all real-valued random variables X such that the expectation (measure-theory integral) E[X^2] is finite is a Hilbert space -- right, it is complete. Amazing, but true.

Then in Hilbert space you get to see how to approximate one function by others. In particular, you get to see how to approximate a random variable you don't have by ones you do have; you might call that statistical estimation, and you would be correct.
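In standard notation, the structure being described is the space L^2 of square-integrable random variables:

    \langle X, Y \rangle = E[XY], \qquad
    \|X\|_2 = \sqrt{E[X^2]}, \qquad
    \|X + Y\|_2 \le \|X\|_2 + \|Y\|_2 \quad \text{(Minkowski)}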
Then you can drag out the Hahn-Banach result and do projections, that is, least squares -- that is, in an important sense (from a classic random-variable convergence result you should be sure to learn), best possible linear approximations. And maybe such an approximation is the ad targeting that makes you the most money.

That projection is a baby version of regression analysis. There's a problem here: the usual treatments of regression analysis make a long list of assumptions that look essentially impossible to verify or satisfy in practice and, thus, leave one with what look like unjustified applications. Nope: just do the derivations yourself with fewer assumptions; you get fewer results, but still often enough in practice -- and they are still solid results.
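A minimal numpy sketch of that "do the derivation yourself" view, showing least squares as orthogonal projection (my own toy data, not the commenter's):

    import numpy as np

    # The fitted values X @ beta are the orthogonal projection of y
    # onto the column space of X.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 3))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residual = y - X @ beta

    # The residual is orthogonal to every column of X (up to rounding),
    # which is exactly the projection property.
    print(X.T @ residual)  # ~[0, 0, 0]
    print(beta)            # ~[2, -1, 0]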
For the usual text derivations, by assuming so much they get much more, especially lots of confidence intervals. In practice you can often use those confidence-interval results as first-cut, rough measures of goodness of fit or some such.

But the idea of just a projection can give you a lot. In particular there is an easy, sweetheart way around the onerous, hideous, hated overfitting -- it seems silly that having too much data hurts, and it shouldn't hurt and doesn't have to! And the now-popular practice in machine learning of fitting with learning data and then verifying with test data, with some more considerations which are also appropriate, can also be solid with even fewer assumptions.
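A toy version of that train/test practice, with numpy's polynomial fitting standing in for the learner (my own sketch; the tendency, not the exact numbers, is the point):

    import numpy as np

    # An overparameterized model tends to fit the training data better
    # but generalize worse than a model of the true degree.
    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, 40)
    y = 1.0 + 2.0 * x + rng.normal(scale=0.2, size=40)

    x_train, y_train = x[:20], y[:20]
    x_test, y_test = x[20:], y[20:]

    for degree in (1, 9):
        coeffs = np.polyfit(x_train, y_train, degree)
        train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(degree, round(train_err, 4), round(test_err, 4))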
Fabulous write-up. Do you publish anywhere besides here, so it's easier for people to wander across in the future? I'd be interested to see what else you've written.
I read this the first time it was posted (it didn't get any comments); it is very poignant, especially the part about relearning trig/geometry and precalc. There are boundless resources for learning now, but you don't get the little endorphin/epinephrine releases you do when your gcc/clang/VS compile succeeds; it's still mostly notebooks, whiteboards, pencils, and 4-color pens (though I've seen lots of cool JS animations, ipython notebooks, and libs in R, matlab/octave, and now julia).
- Besides Dover, Schaum's Outlines are a good cheap resource abundantly available in used bookstores (though there are in fact some typo-ridden ones also).
____________
The best general advice I've seen is the same as what they tell you in college: form study groups and make commitments to regular discussion. Stronger students strengthen their understanding by tutoring others at the whiteboard. There are lots of machine learning and data-sciencey meetups and informal groups springing up, e.g. http://machine-learning.meetup.com/
This guy claims to be a software engineer but then asks this question, "Why spend your spare time learning math, which is challenging and sometimes dry?"
The question invalidates anything he has to say on the subject and makes me question his claim of being a software engineer. No self-respecting person in that position would ever need to ask that question or write an article about it. Nor is it worth wasting anyone's time to read past the first paragraph.
Margaret Hamilton coined the term "software engineer." Anyone not doing landing-humans-on-the-fucking-moon level work can have their claim to the label questioned under any absolute standard.
[1] https://news.ycombinator.com/item?id=7104566