While there are clearly differences between languages, isn't it rather deterministic to put that much weight on the first language? My first language was GW-BASIC. By your logic, there wouldn't be much hope for me, I guess...
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." - Edsger Wybe Dijkstra
It is always amazing to me how much of a mix Dijkstra was. Many of the hand-written articles of his I read are filled with valuable insights, and are absolutely worth the read. And then there are those other comments (like this one) that make me want to pay no attention to anything he had to say.
Of course, this is the guy who said "Computer Science is no more about computers than astronomy is about telescopes." And while I believe I understand the point he was trying to make, I have much higher respect for a person who still likes to get their hands dirty (still actually spends significant time programming computers) rather than just dealing with abstract theory, algorithms, analysis, etc. Yes, I know that Dijkstra knew how to program (and did so, extensively, especially earlier in his career). But this is the guy who never owned his own computer -- even after personal computers became commonplace. That, coupled with the above comment about BASIC (and other comments I've read from him like it), makes him come across as kind of an elitist -- like programming is beneath him or something.
Perhaps my take is wrong, and it very well could be. But I will say this. If Dijkstra was still alive and well, and you put him, Guy Steele, and Don Knuth in a room and asked me to pick two of the three to spend the day with, Dijkstra would be the one left out in the cold.
> If Dijkstra was still alive and well, and you put him...
No harm, Dijkstra never struck me as the kind of person who would want to spend a day in a room with anyone, either. Computer Science is about studying the computation which can be done in hardware - not just Python, not just on one chipset, not just on a von Neumann machine, but in general. "Programming" as I think you mean it is a very small subset of that.
I was not arguing that programming is all there is to Computer Science (which is why I said I understood the point Dijkstra was making with his astronomy quote). However, I disagree with the notion that programming is "a very small subset" of Computer Science -- even though I think Dijkstra would probably agree with you on that. I think programming is a large subset of Computer Science. Yeah, there's a lot more to it than that, but in my book the practical side of things is more important than Dijkstra seemed to think it was.
He was obviously brilliant and very perceptive with regard to many things in the field. However I still think it is over the top that the guy didn't even own a computer.
There are many who would rather spend their careers just writing applications without ever analyzing the differences between two algorithms, much less ever studying or thinking about computation in general. Then there are those who would spend their careers strictly on theoretical endeavors, and do not wish to continue writing applications or software systems of significant value. It is my personal opinion that the best Computer Scientists are the ones that regularly do both.
> If Dijkstra was still alive and well, and you put him, Guy Steele, and Don Knuth in a room and asked me to pick two of the three to spend the day with
Or -- and more relevant to their professional competence -- to develop a large, complex and safety-critical system, such as air traffic control; I wouldn't choose Dijkstra over Knuth or Steele for that job either.
Dijkstra had an unfortunate fondness for the epigram. Cumulatively the epigrams have had the effect of making it easier to remember him as the guy who said bad things about GOTOs, Basic, Fortran, and OS-360, rather than as a pioneer in algorithms, concurrency, and programming languages. Still, I suppose it is something to be able to say pithy things in one's second (or at least not first) language.
I think it's much more important what you choose as a second language. If it is significantly different from the first language, adding new concepts and ways of thinking about programming, there is a good chance that between the two of them learning any other new programming language as needed won't seem as daunting. And you will have a wide range of techniques at your disposal regardless of which language you happen to be using in a given project.
My first language was C, my second was C++, my third Java. I lost all love for them the moment I learned Scheme, and all practical use for them the moment I learned Python.
I don't think it really matters what language is first/second/etc, just that you keep searching and finding new ways of thinking.
My point is simply that you could have reached your current stage of proficiency as a developer quicker if Scheme immediately followed C. C++, Java, and Python would just seem like relatively minor differences in syntax for concepts with which you are already familiar, assuming you learned C and Scheme well.
Of course, the danger is that it can quickly become frustrating to program in the likes of Java once you realize all the things that are simple and straightforward in, say, Scheme but require ridiculous amounts of circumlocution in Java.
You are among the rare people who are willing to learn, again and again. You'd be surprised how many people (outside HN :-) just won't learn. They are Blub programmers. They will be stuck with whatever Blub language you give to them, even C++. I'm glad you didn't, because I'm not sure I wouldn't.
EDIT: Maybe there's no such thing as a Blub language (right tool for the job, etc). However, Blub programmers do exist: half of my co-workers and maybe even myself (to some extent).
There is a danger when your first and second language are too different: you may dismiss the second language as being cryptic, especially if it is more concise than the first.
Well, I'm sorry to say I strongly disagree with the mathematics part. A basis in linear algebra is a definite plus and helped me approach programming in a sensible way.
I started programming before I learned algebra, and was a couple years ahead of the class because solving algebra problems was just "running the computer," as I used to call evaluating code in my head. Substitution was just expanding an inline function (at the time I thought of it like copy+pasting the gosub...)
I think that abstract symbol manipulation within a set of constraints facilitates skill transfer between math and programming. Learning to see patterns, to perform abstractions of statements for simplification, to construct something larger out of component parts bottom-up (like simple proofs) -- these are all useful skills in both.
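To illustrate that transfer with a toy example (Python standing in for the BASIC of the time, and the functions are made up): substituting an expression in algebra and inlining a function are the same move.

    # Algebra: given f(x) = 2x + 3, evaluate g(x) = f(x) + f(x + 1) by substitution.
    def f(x):
        return 2 * x + 3

    def g(x):
        return f(x) + f(x + 1)

    # "Substituting" f by hand -- the expanded/inlined form of g:
    def g_expanded(x):
        return (2 * x + 3) + (2 * (x + 1) + 3)

    assert g(10) == g_expanded(10) == 48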
"especially for a web developer" is the important part. As a fellow web developer, I agree. I'd certainly like to get into the mathier side of programming someday, but it's rarely necessary for my day-to-day operations.
Remember, he's writing a letter to his former self, so the advice applies more to his life than it does in general.
The amount of math you need in programming may not be enormous, but it is certainly not limited to boolean algebra, and it needs to be internalized. For instance, if you don't understand that a function is a mathematical object like any other, you won't understand some very useful things in day to day web programming: closures, lambda, `map`, `filter`… Without those, my web site generator[1] would have been much more complicated.
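A minimal sketch of what I mean (plain Python, with made-up post data): once you see functions as values like any other, closures, lambda, map, and filter replace a pile of loop boilerplate.

    # Hypothetical task: render only the published posts, newest first.
    posts = [
        {"title": "Hello", "published": True,  "date": "2010-01-02"},
        {"title": "Draft", "published": False, "date": "2010-01-03"},
        {"title": "World", "published": True,  "date": "2010-01-01"},
    ]

    is_published = lambda p: p["published"]   # a function is a value like any other
    by_date      = lambda p: p["date"]

    def make_renderer(template):
        # Closure: 'template' is captured and reused by the returned function.
        return lambda p: template % p["title"]

    render = make_renderer("<li>%s</li>")
    items = map(render, sorted(filter(is_published, posts), key=by_date, reverse=True))
    print("\n".join(items))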
Exactly. You can do it without math, but elegance and performance are much easier with math on your side. The one I always find funny is the massive volume of time, money, and energy spent "Load Testing" code, when some simple algebra and statistics can rapidly give you a projection of your scalability. Then you only have to run a test to validate your work. You can test load without math, but it sure saves a lot of time and money to do it with math (see the rough sketch at the end of this comment).
It is always the people who don't understand math and its applications who say, "Oh, I don't need it for what I do" (web development, I am looking at you) and then wonder why their systems are complex, poorly performing, and inelegant.
Please learn math before you say you don't need it. Or, more specific to this article: don't state that programming is complex and hard and then follow that with "you don't need math." It's only hard because you are not using math.
I have done 3D simulation, AI, robotics, embedded and now web over the course of my career, and the only needlessly complex systems I have run into were the ones designed without the assistance of mathematics.
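Here is the kind of back-of-the-envelope projection I mean (all numbers invented; it tells you what the load test should show, it does not replace the test):

    # Crude capacity projection from a few measured numbers (all made up).
    avg_service_time_s = 0.050   # measured mean time to handle one request
    workers            = 16      # app server worker processes
    peak_rps_expected  = 250     # traffic projection for launch

    capacity_rps = workers / avg_service_time_s       # ~320 requests/sec ceiling
    utilization  = peak_rps_expected / capacity_rps   # ~0.78

    # Queueing rule of thumb: latency blows up as utilization approaches 1,
    # roughly service_time / (1 - utilization) for a simple M/M/1-style model.
    projected_latency_s = avg_service_time_s / (1.0 - utilization)

    print("capacity ~%.0f req/s, utilization %.0f%%, latency ~%.0f ms"
          % (capacity_rps, utilization * 100, projected_latency_s * 1000))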
No, DBAs tried to come up with a math-like name for writing queries so that they appear to have a more prestigious skillset than they really do. Any high school kid who's written "hello world" can get through databases 101, including expressing queries as relational algebra, in a week.
Try to debug performance problems in queries, or scalability problems across your whole database. Trust me, competence in math will quickly prove very useful.
Of course if you are not competent in math you won't do well at that kind of task, and because you don't know better you won't necessarily recognize your inability for what it is.
Not to mention that the points on elegance and simplicity may themselves be a manifestation of a lesser understanding of mathematics. I found that after gaining a more competent grasp of advanced mathematical concepts, my systems became more ordered and elegant.
Even for individuals with a lesser grasp of certain forms of mathematics, math helps you find elegant solutions to complex problems. This is hard to convey to someone who does not have the education in mathematics; they only realize they have been doing things the hard way after they have learned the math and found that the two are interrelated, even for web development.
The thing is the answer is YES, but you won't even understand why unless you have the appropriate math knowledge.
There would be a lot less terrible code written out there if people had a better grasp of discrete math, knew how to do their big-O analysis, knew how to create and implement provably correct algorithms, knew lambda calculus, etc., and that's just for general programming. I would say a basic grasp of set theory and graph theory applies to almost everything too.
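A small, made-up illustration of the big-O point in Python: the two list comprehensions below compute the same thing, but knowing that membership in a list is O(n) while membership in a set is O(1) on average is what tells you which one survives real data sizes.

    # Which of these user ids are banned?  Same answer, very different cost.
    user_ids   = list(range(10000))
    banned_ids = list(range(0, 10000, 7))

    # O(n*m): every 'in' scans the whole banned list.
    flagged_slow = [u for u in user_ids if u in banned_ids]

    # O(n) expected: build a set once, then one hash lookup per user.
    banned_set   = set(banned_ids)
    flagged_fast = [u for u in user_ids if u in banned_set]

    assert flagged_slow == flagged_fast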
I have discrete under my belt. And a bunch of other stuff too (came a bit short of having a minor in math in college).
And... well, I can't say that I've ever explicitly used any of it in programming. I suspect you're falling into the trap of generalizing from a field you're familiar with to all fields, and that's a generalization that doesn't hold up.
In part the key word here is "explicitly." I have a PhD in Combinatorics and Graph Theory, and I have lectured in Calculus, Group Theory, Functional Analysis and Topology. In my daily work for the past 18 years I have explicitly used that background exactly once.
But the ongoing influence, the style of thought, the ability to visualise and the ability to abstract away from the details - these things I use all the time, every day. They have been enhanced and honed by all that math.
I use my math background implicitly all the time, and I don't know how I'd do what I do without it.
And this is perhaps the most important point about studying math. Often the greatest take-away is the abstract problem solving capability, not the material itself. I've never had to analyse the genus of a manifold in real life, but I have thought about objects moving in an 11 dimensional space with holes, because while everyone else was stuck in the details, I was seeing things differently. It turned out that the combination of styles was critical to solving the problem.
> Often the greatest take-away is the abstract problem solving capability, not the material itself.
That's why I got a degree in philosophy.
OK, not the only reason, but one of them...
Meanwhile, I think my point stands. Too many people seem to have an "OMG you don't use linear algebra every day? What kind of crap programmer are you?" attitude.
I suspect that you've had opportunities to use your math, and had you seen those opportunities, you would have recognized their potential utility.
I say this as someone who periodically finds yet another way to use my math background to help me do things while my co-workers in the same job don't. Which is why I'm the one who got to do the statistics for evaluating A/B testing, eliminated multiple performance problems others had come to accept, and gets tapped to tackle complex "how do we figure this out" problems from time to time.
And yes, I've had the fun of seeing co-workers who theoretically had the same job as me argue that math wasn't useful for programming, while I was sitting there with a list of examples of how I'd recently used my math background. (But then I got promoted into a different job title and in those discussions those programmers were able to respond that of course I needed more math for reporting, but they didn't need it as web developers. Never mind that I was the guy who got asked to help when the database had trouble scaling at peak traffic of several million pages/hour...)
Really this seems to me to be a variant of the blub paradox. If you haven't learned to reflexively see the benefits of thinking in some new way, you won't recognize how thinking that way could help with the problems you are solving day in and day out. Conversely if you have gained those mental skills, you recognize their utility in situations that other people wouldn't dream they are applicable for.
Not much, until you actually need to do something with the data, which is usually returned to you as what is basically a large two-dimensional array.
At that point, if you haven't figured out at least basic algebra, you'll have trouble conceptualizing the data. At best, this means you won't be as efficient as you could be; at worst, you'll write buggy code.
There's not a lot of math in plugging forms into database rows, or even in plugging values into MSRs and managing interrupts, but there's enough math in general programming that I constantly regret ditching that part of my education.
I am, with surprising regularity, annoyed that I can't pull basic trig out of my head without looking things up -- to say nothing of signal processing and number theory.
>As a linguistics major, you're no stranger to the idea that a person is only capable of having thoughts and ideas that can be expressed in their language
As a linguistics major, you have no excuse for not knowing that Sapir-Whorf is utterly discredited.
Replying to your post in the recent discussion, http://news.ycombinator.com/item?id=1033741, I don't understand your reasoning. Sure, you can extend a language to introduce a concept. But the lack of that concept can still have molded speakers' thinking.
The problem is that there isn't one "Sapir-Whorf hypothesis". There's a spectrum of such hypotheses, running the gamut from the mild "language has an effect on how we think" up to the extreme "if your language uses the same word for blue and green then you'll be blue/green colorblind".
The extreme forms have been rather thoroughly debunked at this point. The milder forms verge on tautologies.
"As a linguistics major, you're no stranger to the idea that a person is only capable of having thoughts and ideas that can be expressed in their language, and there is no reason to expect programming languages to differ from spoken languages in this area."
I thought this idea (the Sapir-Whorf hypothesis) had been discredited.
It's ok to suck at maths if your idea of maths is limited to arithmetic (something computers are indeed very good at) and your idea of programming is limited to hooking up web forms to databases.
I'm not a linguistics major but I do speak several human languages and have no trouble thinking in them and expressing ideas in them. The first computer languages I learned (6502 assembly, BASIC) don't enter my conscious thought when I think about the programming problems I encounter with the languages I use today.
Look, I'm sorry to be the one to break this to you, but if you have difficulty with any programming concept, you must not be a supergenius. You're just an ordinary genius at best. I'm sorry. Life isn't always fair.
Of course, I say this as someone who hasn't yet tried to learn Haskell. On the other hand, I know someone who competes at the national level and I never saw him have trouble with anything including Haskell, so...
The sad truth is that there are some people for whom programming comes as naturally as thinking, with code formed as easily as thoughts; and if it takes an effort to understand any aspect of programming, you have just learned that you are not one of those people. Alas.
"Programming is not always intuitive, it's inherently complex, and it's challenging. Once you start feeling like you've gotten a handle on it, you'll learn something new that will make things seem even more complex for a while."
This applies to pretty much any field - engineering, physics, chemistry, even music!
My background is in electrical engineering and it's quite daunting to realize how little I REALLY understand when it comes to the fundamentals... Sure an engineer can make things "go" but they're standing on the shoulders of giants.
What about game programming? Nobody mentioned it. It's the ultimate test. Try to hack together a simple pool game. You'll be amazed at how much maths and physics go into a simple game that millions use and enjoy.
'At this point in the book, I was originally going to present a BSP-based renderer, to complement the BSP compiler I presented in the previous chapter. What changed my plans was the considerable amount of mail about 3-D math that I’ve gotten in recent months. In every case, the writer has bemoaned his/her lack of expertise with 3-D math, and has asked what books about 3-D math I’d recommend, and how else he/she could learn more.
That’s a commendable attitude, but the truth is, there’s not all that much to 3-D math, at least not when it comes to the sort of polygon-based, realtime 3-D that’s done on PCs. You really need only two basic math tools beyond simple arithmetic: dot products and cross products, and really mostly just the former. My friend Chris Hecker points out that this is an oversimplification; he notes that lots more math-related stuff, like BSP trees, graphs, discrete math for edge stepping, and affine and perspective texture mappings, goes into a production-quality game. While that’s surely true, dot and cross products, together with matrix math and perspective projection, constitute the bulk of what most people are asking about when they inquire about “3-D math,” and, as we’ll see, are key tools for a lot of useful 3-D operations.'
(Michael Abrash, "Graphics Programming Black Book Special Edition")
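As a rough sketch of what that looks like in code (Python, with an invented triangle and viewpoint), the dot product alone answers "is this polygon facing the camera?", which is the heart of backface culling:

    def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def dot(a, b):   return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def cross(a, b): return (a[1] * b[2] - a[2] * b[1],
                             a[2] * b[0] - a[0] * b[2],
                             a[0] * b[1] - a[1] * b[0])

    # A triangle and an eye point, both invented for the example.
    v0, v1, v2 = (0, 0, 0), (1, 0, 0), (0, 1, 0)
    eye = (0.3, 0.3, 5.0)

    normal = cross(sub(v1, v0), sub(v2, v0))   # points along +z for this winding
    to_eye = sub(eye, v0)
    facing_camera = dot(normal, to_eye) > 0    # True here, so draw the triangle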
Not much math. Some simple physics like velocity and momentum transfer that you can learn in a tutorial (in a day). I think it is important to make the distinction between game development (lots of simple math and complex "pluggable" formulas) and game engine development (yes, you need to understand linear algebra, trig, and perhaps calculus).
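For example, the "momentum transfer" part of a pool game is roughly this much math (a minimal sketch assuming equal-mass balls, no spin, no friction):

    import math

    def collide(p1, v1, p2, v2):
        """Elastic collision of two equal-mass balls in 2D: swap the velocity
        components along the line of centers, keep the tangential ones."""
        nx, ny = p2[0] - p1[0], p2[1] - p1[1]
        dist = math.hypot(nx, ny)
        nx, ny = nx / dist, ny / dist      # unit collision normal
        a1 = v1[0] * nx + v1[1] * ny       # normal component of each velocity
        a2 = v2[0] * nx + v2[1] * ny
        d = a2 - a1
        return ((v1[0] + d * nx, v1[1] + d * ny),
                (v2[0] - d * nx, v2[1] - d * ny))

    # Cue ball hits a stationary ball head-on: all of the momentum transfers.
    cue_v, obj_v = collide((0, 0), (1, 0), (1, 0), (0, 0))
    # cue_v is now (0.0, 0.0) and obj_v is (1.0, 0.0)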
I believe, when the parent said "hack together a basic pool game," they meant "hack together just enough of a game engine to write a pool game on top of it, and then write said pool game atop it."
No, game development includes 'game engine development' - the majority of companies write their own engine or have heavily modified a licensed engine. And you often need to understand quite a bit of math to use even off the shelf engines - to debug issues and to tweak stuff.
Perhaps you're referring to what industry folk call gameplay programming? There's a lot less hardcore math there - but you still typically need to understand basic physics, trig, interpolation, etc.
Most of the time working with a game engine does not involve inventing any new mathematics, so there is an upper bound on just how hardcore it can really get :)
The bit about being constrained by your first language is demonstrably not true (read pg's own account!). It can be a burden, but what stops people from progressing isn't this; it's the usual suspects: arrogance and ignorance. Once you stop judging a language purely on its merits and, thinking you've found the best, begin evangelizing it, you will have problems seeing more powerful ones (because the language has become part of your id).
You have to treat a programming language like a great chess player treats possible moves: when you find a great one, sit on your hands and look for a better one.
As far as math: in my experience it isn't required. It will make you better and make your work easier. I've had good math people replace whole algorithms of mine with a couple of math statements. But if you really devote yourself to getting better at programming, learning a lot of diverse languages and so on, your math will get better. I've found it easier to learn certain math concepts from related programming concepts that I had already learned.
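The toy version of that experience, just to make it concrete (a made-up example): the loop I would have written, and the one-liner a math person replaces it with.

    # Sum of the first n integers: the loop I'd write...
    def total_loop(n):
        s = 0
        for i in range(1, n + 1):
            s += i
        return s

    # ...and the closed form a mathematician reaches for: n(n + 1) / 2.
    def total_formula(n):
        return n * (n + 1) // 2

    assert total_loop(10000) == total_formula(10000)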
Programming isn't hard, programming is fun! Ok, it is hardish sometimes, but hard in a fun way, not hard in the non-fun way this article seems to imply.
And I really don't think your first language is all that important, programming is still fun usually, whatever the language. It's only later that we learn the fine art of language snobbery ;)
Absolutely, but there is fun-hard and there is hard-work-discouraging-hard, and this article seemed to me to be talking more about the latter. In fact I'd say that non-hard programming usually gets not-so-fun (although it still has its charms).
No, really, sometimes it's "hard-work-discouraging-hard". I don't know anyone who thinks it's fun to pore through strace or tcpdump output trying to figure out obscure bugs at the OS or network layers. I've known a couple people who liked looking at generated assembly to debug compiler issues, but they are a rare breed. Race conditions or 1 in a million bugs really suck. Trying to debug any of these things while your company hemorrhages money or the phones are ringing off the hook with pissed-off customers; any "fun" you're having is just the adrenalin trying to keep you from being eaten by a tiger.
Working on hard problems, of your choosing, on your own schedule, can be fun and rewarding. But the reality is, you're not always going to get that.
I didn't get as far as you in figuring out what it was talking about. Mostly it made me think of Wolfgang Pauli's famous quip - 'It is not even wrong'.
I didn't need any advanced math to program until I started tackling computer vision problems.
Estimating 3D surface normals and depth from multiple photos of an object? Break out the matrix solvers. Computing homographies between images? Better know what an eigenvector is.
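For instance, estimating a surface normal from a patch of 3D points comes down to linear algebra; here is a rough numpy sketch on synthetic data (not production code):

    import numpy as np

    # Synthetic patch of points lying near the plane z = 0.1x + 0.2y.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-1, 1, size=(50, 2))
    z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0, 0.01, 50)
    pts = np.column_stack([xy, z])

    # The estimated normal is the direction of least variance: the last
    # right singular vector of the centered point cloud.
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1] / np.linalg.norm(vt[-1])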
My first experiences with programming must have been very different than his, with respect to the first section in particular. I always knew that programming was supposed to be hard - I grew up knowing no programmers, teaching myself the esoteric art of C++ from a copy of "Sams Teach Yourself Visual C++ 6 in 21 Days". So when I understood it reasonably well, I felt I must be above average. Indeed, I've never felt the feelings of "frustration and discomfort" he references. Instead, I have always had to battle with my hubris in thinking that I'm that much better than the programmers around me.
Yes? My point was simply that where he struggled with frustration, I struggled with my ego. Considering he is probably a better programmer than me, I thought this was an interesting contrast to bring up. What is the problem?
Maybe you should clarify your posting. It reads as
"Well I never had any problems learning to program. I am pretty much the smartest person in the world I guess. My "problem" is that I am so great I have too big of a head."
What is math? There are many answers, so the one I pick for this post is that math consists of starting with some basic axioms, chosen to be as simple as possible, then rigorously exploring what else you can extract from your simple axioms by concrete proofs. It is staggering what you can get from simple axioms. It is staggering the subtlety with which they can interact.
What is programming? It is the art of starting with very simple primitives, then rigorously building up slightly more complicated primitives, then building another layer on top of that, until eventually you get to a level where you can do actual work. It is staggering how far we get on how few primitives; it is incredibly educational to read what opcodes a processor actually implements. (Even better, make sure you read just the modern subset.) I mean, it pretty much just has "move this here", "add this", "multiply this", "divide this", and "if this thingy is 0 jump there". Yes, I know there's a few more, but the point is that it definitely doesn't have an opcode that downloads a webpage. It is staggering the subtle ways in which these things can interact.
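To make that concrete, here is a deliberately toy sketch in Python (an invented five-opcode machine, nothing like a real instruction set) of how little you need before everything else becomes layering:

    # A toy machine: a register file and five opcodes, nothing more.
    def run(program, regs):
        regs, pc = dict(regs), 0
        while pc < len(program):
            op, *args = program[pc]
            if   op == "mov": regs[args[0]] = regs.get(args[1], args[1])
            elif op == "add": regs[args[0]] += regs[args[1]]
            elif op == "mul": regs[args[0]] *= regs[args[1]]
            elif op == "dec": regs[args[0]] -= 1
            elif op == "jz":  pc = args[1] - 1 if regs[args[0]] == 0 else pc
            pc += 1
        return regs

    # Factorial of r0, built from nothing but those primitives.
    prog = [
        ("mov", "acc", 1),
        ("jz",  "r0", 5),     # loop exit once r0 reaches 0
        ("mul", "acc", "r0"),
        ("dec", "r0"),
        ("jz",  "zero", 1),   # 'zero' always holds 0, so this jumps back unconditionally
        ("mov", "out", "acc"),
    ]
    print(run(prog, {"r0": 5, "zero": 0})["out"])   # 120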
It is absolutely possible in both the mathematical and programming cases to do "real work" without having the understanding of things that I refer to in my previous paragraphs. A web programmer does not constantly sit and do logic proofs, an accountant does not constantly refer to number theory throughout their day. Of course this is fine for the accountant, who is not expected to do original work in the field of accounting. (It is rather discouraged, in fact.) So of course it's OK for an accountant to have a very tool-like understanding of numbers. Are you, the programmer, expected to do no original work in the field of computing, such that you don't need to understand computing deeply? It may be so. Such jobs exist. But watch out, that means you're one library acquisition away from not having a job anymore! (And if you can't be replaced by a library, you're doing original work of some kind. Most programmers are.)
Look back at my first two paragraphs, where I have obviously drawn parallels. The real value of mathematics for a programmer is not that the programmer is likely to be sitting there doing matrices all day long, or even worrying much about logic problems, and they certainly aren't going to be sitting around all day doing sums. What mathematics provides is a clean place to learn the relationships I talk about, how we build the large concepts from the small concepts, and provides a playground where you can have that famous all-but-100% certainty that mathematicians like to go on about (justifiably so).
This is great practice for programming anything beyond a trivial project, where, if you have a clue, you will probably be starting with building up some reliable primitives, and then trying to build bigger things out of them. Bad programmers just start slopping concepts together with glue and just pour on more glue when they get in trouble, and produce what can only be described as, well, big piles of glue with no underlying order. A programmer who has become skilled in mathematics has at least a chance of producing something that is not merely a big pile of glue, and can have characteristics in their program that are characteristics that a big pile of glue can't have.
It is possible to come to this understanding without passing through formal mathematics, but it is much harder, because the world of programming is ultimately the world of engineering, and it is much harder to see these patterns. They are there, but they are obscured by the dirtiness of the real world.
That the mathematics may have an independent use is gravy; even if it were somehow otherwise worthless but programming was somehow unchanged (not really possible, but go with me here for the sake of argument), it would still be a worthwhile study. There are few better ways a programmer can spend their time than to become familiar with mathematics. Without the understanding of programming I outline above, regardless of which path you take to get there, your skillset will plateau, the maximum size or complexity of a system you can build without it coming apart will top out noticeably sooner than it does for those who do have this understanding, and there will be things that remain forever a mystery to you. (Like how those large programs really work.)
In the beginning of Hillegass's book, Cocoa Programming for Mac OS X, he has a great quote of someone from Caltech being asked about the real world usefulness of a degree in astrophysics. His response was "Not much, but if I run into a hard problem and start thinking I must be stupid because I can't figure this out, then I remember I have a degree in astrophysics so I am not stupid and this must be hard. So in that way it is useful." I'm paraphrasing (the book is buried somewhere) but that always stuck and your post reminded me how important it is to keep at it because coding is not easy, but it is worth it.
Thanks for mentioning this.
I've actually been working through that book over the past few days, and that anecdote really got under my skin. For some reason it didn't dawn on me when writing the article. I'll make note of it.
> My mother, who has the same [thermostat], diligently spent a day reading the user's manual to learn how to operate hers. She assumed the problem was with her. But I can think to myself "If someone with a PhD in computer science can't understand this thermostat, it must be badly designed."