A more appropriate title for the post might be "Become an expert programmer in 10 years."
We seem to focus on the poles: no coding skills vs. expert coding skills. You can teach yourself to code in less than 10 years, you just might not be an expert.
The "learn to code" debate might be more productive if we allowed for definitions of competency at the stages leading up to expert. The post does a good job of defining the characteristics of expert competency. What does it mean to be an intermediate? How many hours should you expect to invest to get there?
In most domains, the learning curve ramps up sharply for the first few years and then plateaus for a longer period. For example, you might move up 80% of the learning curve in 3 years with an intense effort, but the remaining 20% of the journey might take 7 additional years ... or a lifetime.
Completely agreed. Code I wrote with 6 months of programming experience has saved companies millions of dollars by reducing work that required ~10 expensive people to requiring ~1 cheap person. A real developer could probably further reduce that 1 to 0, but examples like these show that even basic automation skills can contribute huge amounts of value.
I tend to call myself a 121 programmer: I try to improve the total value of my knowledge base by 1% every 21 days. That's realistic, and those 1%'s add up over time.
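Compounded, those 1%'s are worth more than they look. A quick sketch of the arithmetic (the figures are just the compounding implied by the "121" rate):

```python
# 1% improvement every 21 days, compounded.
periods_per_year = 365 / 21          # about 17.4 improvement cycles a year
yearly = 1.01 ** periods_per_year    # roughly 1.19x per year
decade = yearly ** 10                # roughly 5.6x over ten years
print(f"{yearly:.2f}x per year, {decade:.1f}x per decade")
```

So the modest-sounding rate works out to nearly a 6x gain over the ten-year horizon the article talks about.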
MacLeod Sociopaths: 121 programmers who optimize for learning.
MacLeod Clueless: 911 programmers who deal with hideous emergencies and compensate for bad management. (9-1-1 is the US emergency telephone number.)
MacLeod Losers: 501 programmers who are out the door as soon as they can go home.
An interesting implementation of the 'learn X in Y years/hours' concept is here: http://thedanplan.com.
"It’s a project in transformation. An experiment in potential and possibilities. Through 10,000 hours of “deliberate practice,” Dan, who currently has minimal golf experience, plans on becoming a professional golfer. But the plan isn’t really about golf: through this process, Dan hopes to prove to himself and others that it’s never too late to start a new pursuit in life."
After 3 years/4000 hours, he's at a 5 handicap after starting from nothing.
"Think of your many years of procrastination; how the gods have repeatedly granted you further periods of grace, of which you have taken no advantage. It is time now to realize the nature of the universe to which you belong, and of that controlling Power whose offspring you are; and to understand that your time has a limit set to it. Use it, then, to advance your enlightenment; or it will be gone, and never in your power again." (Meditations 2:4)
Isn't it odd and curious and such like that the chunks of time bandied about are powers of ten? Coincidentally equal to the number of fingers on our hands. 10 years. 10,000 hours. I'd be willing to bet my granny's dentures that if we had eight fingers on each hand those numbers would be 16 years and 65536 hours (well, I mean it would still be 10 and 10,000 but you know what I mean).
Also, did you know that in many cultures (ancient Greece, Aramaic, Hebrew, Chinese, to name but some) 10,000 just means "a heck of a lot", "more than you can count". So you see, 10,000 is a bit too magical for me to take seriously.
Imprecision (one significant digit) + Benford's law (most numbers start with 1) means most numbers needing citation are of the form 10^n. http://en.wikipedia.org/wiki/Benfords_law
Interestingly, this would be true even if we had 8 fingers per hand. If we had 16 total fingers, 16 (base 10) would be written as "10" (base 16). 15 (base 10) would be a one digit number in base 16.
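The base arithmetic is easy to check. A quick sketch in Python (the `to_base` helper is just for illustration):

```python
# Convert a non-negative integer to its digit string in a given base (2-16).
def to_base(n, base):
    digits = "0123456789abcdef"
    if n == 0:
        return "0"
    out = ""
    while n:
        out = digits[n % base] + out
        n //= base
    return out

print(to_base(16, 16))    # "10": with sixteen fingers, sixteen is written "10"
print(to_base(15, 16))    # "f": fifteen is a one-digit number in base 16
print(int("10000", 16))   # 65536: base-16 "10,000" in decimal
```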
On the other hand, you'd rather have six fingers per hand. Base 12 would be great because it has whole thirds! Also, 12 is divisible by four numbers (2, 3, 4, and 6), while 10 is divisible by only 2 and 5, and fifths are much less common than thirds.
That's why 12 inches in a foot and 3 feet in a yard is a nice way to measure things (though the rest of the imperial system, excepting maybe Fahrenheit, has no excuse!)
Much more mathematically interesting, and it tortured schoolchildren for many years, to their great benefit. Twelve pennies in a shilling, and 20 shillings to the pound sterling.
The coins at or shortly before decimalisation were: farthing (1/4d), ha'penny (1/2d), penny, threepence (3d), sixpence or tanner (6d), shilling, florin (2/), half crown (2/6), and the crown (5/). The first paper note was the ten-bob note (10/), equivalent to 50 new pence, followed by the pound (20/). Until 1914 there had been sovereign and half-sovereign coins instead of the 10/ and pound notes.
That's why many things in the metric system are designed on a grid with edge length 60 - it gives you many possibilities for subdivisions. For example, the dimensions of most furniture are based on this - 30 cm, 45 cm, 60 cm, 90 cm, 120 cm and 240 cm everywhere.
Benford's Law applies because the wider natural distributions tend to be flat in the log space. One example is the log-normal distribution, which is what you get when random variables compound multiplicatively. (When they compound additively, you get a tighter Gaussian.)
So let's say that the values in bank accounts (just for an example) are $10^(N(3.5, 2)). In the log-space, this distribution is relatively flat on [2, 5] so let's focus on [3, 4]. The flatness at the top of the bell curve means there'll be just as much mass in [3, 3.3) as in [3.7, 4); converting back into dollar amounts that means there are as many between [1000, 2000) (10^3.3 is close enough to 2000) as in [5000, 10000). So you have as many leading 1's as you do of all digits 5-9.
I prefer to call it the Benford Effect. It's not a law. You don't get it for all distributions. It's not the case for human height, and IQ's have a hyper-Benford effect (50% of leading '1's) purely on account of how the distribution is defined. You only get it when the distribution is flat in the log space over at least one order of magnitude.
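The log-normal argument above is easy to simulate. A minimal sketch, using the 10^N(3.5, 2) "bank account" example from the comment above (stdlib only; the sample size and seed are arbitrary):

```python
import random

random.seed(1)

def leading_digit(x):
    # Shift x into [1, 10) and take the integer part.
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

# "Account balances" distributed as 10**N(3.5, 2): flat in log space
# over several orders of magnitude, so the Benford effect should appear.
n = 100_000
ones = sum(1 for _ in range(n)
           if leading_digit(10 ** random.gauss(3.5, 2)) == 1)
print(ones / n)  # close to Benford's log10(2) ~= 0.301
```

By contrast, sampling something tightly distributed like adult height in centimetres would put almost every leading digit at 1 or 2, which is the point about it not being a universal "law".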
Why is that a reason not to take it seriously? It's an order-of-magnitude estimate, like Carl Sagan's "billions and billions" of stars, or the Bible's "40".
> Also, did you know that in many cultures (ancient Greece, Aramaic, Hebrew, Chinese, to name but some) 10,000 just means "a heck of a lot", "more than you can count". So you see, 10,000 is a bit too magical for me to take seriously.
That's the salient point. When Norvig says it takes "10 years" to be a programmer, he means it takes a heckuva long time to master the art, not that it takes 87648 hours with a 95% confidence interval. That figure is not meant to be taken at face value.
Such a rule-of-thumb needs a crucial caveat (said by others better than I'll say it here): You need N years of steadily, deliberately advancing full-time work.
There's a fundamental difference between gaining 10 years of experience, and gaining 1 year of experience 10 times.
It's harder than one might think to sustain continual real advancement over such a period; generally the employer paying the bills needs us to occasionally "take one for the team" and do scut-work for a while, simply due to normal business unpredictability; or the work dries up completely and you have a hiccup while you start somewhere new.
A source of work where opportunity continuously keeps up with ability is a precious thing.
The other caveat is that you have to be concentrating and paying attention to your performance for those 10,000 hours.
Very few people, if any, have the metacognition and executive function abilities to pay attention and steadily improve for 8 hours a day. The experts in Outliers and the research it's based upon would typically practice for only 4-5 hours/day and then rest, or do fun stuff like playing through pieces or doing a few flips or easy jumps. A bunch of anecdotal evidence from myself and my friends in Ph.D programs or professional jobs, plus RescueTime's survey of YC founders, also supports that most people can get no more than 4-5 hours of cognitively demanding work in per day. Hence 10 years ≈ 10,000 hours: roughly 4 hours a day, 250 working days a year, for 10 years.
When you're working an 8 hour day, it's usually more like 4-5 hours a day of heavy concentration, plus 3-4 of overhead (checking e-mail, responding to e-mail, doing grunt work like unit tests that you already know how to do, talking to coworkers, sitting in meetings). If you're lucky. For many senior devs it's 8 hours of talking to people and zero coding.
(Although you can turn that into deliberate practice as well. For a while, shortly before and after I made Senior SWE at Google, I was spending the vast majority of my time answering questions from my peers and doing code reviews. I had very little time for coding and my basic coding skills improved very little, but I got really good at being able to dive into an unfamiliar codebase, identify what people were trying to accomplish, and find a way to make it happen in a not-completely-gross fashion. That skill proved invaluable later on, and basically saved the launch of one high-profile Google product.)
If you want to become an expert, you have to put in the time. Personally, all I want is to be able to throw some things together and tell good code from bad, but I'm not a programmer.
I think this is pretty applicable everywhere, though. The cycle of challenging yourself, working through it and challenging yourself again is how I learn best, at least.
I am never surprised when this article reoccurs on HN. Don't try to rush it and continue to put in your time, that is the only way to get better at the craft.
In some ways that's true - a lot of employers don't care if something is done right or done well, just done fast. Youth and enthusiasm can get you shipping quickly.
It's just that what you're shipping, and how often you have to rewrite whole chunks of the product, can be questionable.
As much as I agree with Norvig here, linking to this article in response to the "I want to learn how to code to build my startup" question is not helpful. Entrepreneurs aren't going to wait 10 years before launching their startup.
The difference, of course, is that some people just want to build an MVP, while others want to be a programmer. In that case, I'd rename this essay: "Become a Programmer in Ten Years."
You can become a programmer very quickly. But if you want to teach yourself to become a good one (something that no one else will do for you), you need to put in the time and effort.
The article isn't called "become an entrepreneur in ten years". I didn't see it as a response to anything. It's just a good article, one of the 5-6 that regularly get upvoted every year or two.
One thing to remember is that the author is not suggesting you learn for 10 years and only then start to code.
You can learn to code in 21 days and then start your startup; it will be very difficult and slow. But if you are dedicated and keep at it, you will get somewhere. And 10 years into it, you will be an expert.
It is helpful. It will probably force them to hire at least one real programmer, and thus save them from Google indexing their private data or some other epic fail like that.
Self-teaching programmer here. Extensive IT background, and have had a CS class or two in public "B School" type university. After Codecademy, Learn x the hard way a little bit, and even command line hacking with REPL's I have been able to conquer my fear of the abstraction layer. And although I am able to look at programs that exist and I use from a programmer's mindset, I still can not understand how to model my ideas and build them out. I think with a better understanding of core CS concepts and a thorough mental whipping I will be in a better shape. My plan is to get through:
1. SICP/SICP Python
2. {Insert famous Data Structures/ Algorithms book here}
3. Find some open source project on GitHub related to what I want to do, and extend it, or even better, refactor it.
From there I will be ready to feel more like a programmer. Cause I really breeze through these tutorials, but seem lost thereafter. I am not seeking prodigy status, but I know that I can become a very formidable programmer sooner or later.
I just started working with python about 3-4 weeks ago. No experience other than html, css3 and graphic design.
So I started doing some online classes, but they were so boring and simple that I finished them within a day or two. I actually started Udacity's CS class, where they teach you how to design a search engine, but halfway through I realized that this can be done twice as fast with tools that already exist.
From there I read some Django docs and installed it, but Django was the most confusing framework to me (not really sure why; I just didn't like it).
I stumbled upon Flask and I fell in love. Without any knowledge of how to program and how things work, I simply started coding and working with a db (Postgres). Within a week or so I designed a small blogging application that lets you register users; each user can upload photos, create docs and delete them.
I try to code very neatly, and comment on everything. There are a couple of solutions for everything so if I find a better solution I comment out the first one and apply another one.
PS. Don't code for 12h straight. Go out.
If I don't know something, I seek direction, not an answer. I also walk away from my computer and think about a solution, and these solutions tend to be very simple and can be done with a few lines of code. I guess I use my creativity to solve code problems.
Don't forget to intersperse that with random little projects that you start because you enjoy or want to fix something that bothers you.
IMHO the programmer mindset has less to do with knowing exactly how to fix a problem than with "Hey, I bet I can fix that, and it probably wouldn't even be too hard!" (which are also famous last words, but you can ignore that :)
I feel like a noob too, mostly because I started at 18!
That's to say, I started taking programming seriously at 18. I did write a crappy Visual Basic exe when I was 8, and I dabbled a bit in Flash ActionScript in my teenage years, but I don't really count those years, because I was learning by rote then.
I really only started taking programming seriously when I was in college. Writing an RTOS from scratch was one of my biggest coding achievements. A lot of times, I wish I had a job that's more technically challenging than what I'm doing now.
I started at age 7 with BASIC on a C64. Then moved on to reading C and C++ books once I could read English. Yeah, I had to learn English before learning how to program. It makes me crack up every time I think about it. Anyhow, I had a, uh, rather interesting phase where I was into reverse-engineering programs and systems. Luckily, I did not have internet access until the end of that phase. Still remember having some fun learning stuff on IRC. By the time I was 20-ish, I was heavily into robotics. But back then the Arduino was not a reality, so it was mostly using PIC16F84 chips and 555 timers. And then I had another BASIC phase, which led me to discover Python. Then Lisp. Then Visual Basic, C#, and the .NET framework. Had too much fun with Python and Lisp. Love them both. Went back to them. I mostly do Python these days, but love writing Lisp whenever I can.
I started at 12 with C++, then I went over to 13 and learnt some Python, then I learnt Lisp at 16 and that's when I started to actually learn programming.
I know this sounds like a "lel Lisp is superior to your blub lang" but it's more of a "I actually went past procedural programming and stupid OO with Lisp and it felt like I finally actually knew something about the stuff I did".
Dunno why I'm typing this out though, oh what the heck, posting it anyway.
Yep, I can recall that after a year or so at my first job I was considered a bit flash because I used mixed case in my FORTRAN Hollerith statements; it just made the prompts look nice on our PDP-11s.
In fact, having basic prompts which told the user what the next input was was an innovation. I refactored some of the senior guy's code to add prompts; he was a bit sniffy.
"I know what to type when" is OK for an experiment at 3-foot scale, but scaled up to 1:1, where filling the rig's tank cost 20K (about half the cost of a house at the time), making sure you didn't blow a run was more important.
Ah. This site. I came across this when I was a kid - like 5-6 years ago - I had googled the term "how to become a hacker".
This guy had opened my eyes then - I was 14 then. He still continues to do so. His links are still relevant and an amazing starting point for anyone who wants to learn.
One should probably keep visiting this page like every 6 months or so - just to check whether you are on the right track to learning.
The article makes a good point. But what's more useful is that programmer skills matrix that someone posted to HN about a year ago. It had a row for every area in comp sci, and then it had columns progressing in order of depth of knowledge. Over the ten years you could check off each box one by one.
Given the way we shrug people off for using a language we don't like (or not seeming enough of a polyglot), being pessimistic, not working in the past few months, or any number of other silly things, I can't see how a bias against people with less than 10 years would really hurt matters much further.
Um, hasn't this been posted and reposted already? Not trying to be a smartass, but I thought this article was already pretty popular, and generally the standard response to any beginner asking how he could learn X in 10/21 days.
regarding 10,000 hours ... from Moonwalking with Einstein
"What separates experts from the rest of us is that they tend to engage in a very directed, highly focused routine, which Ericsson has labeled “deliberate practice.” Having studied the best of the best in many different fields, he has found that top achievers tend to follow the same general pattern of development. They develop strategies for consciously keeping out of the autonomous stage while they practice by doing three things: focusing on their technique, staying goal-oriented, and getting constant and immediate feedback on their performance. In other words, they force themselves to stay in the “cognitive phase.”...
The best ice skaters spend more of their practice time trying jumps that they land less often, while lesser skaters work more on jumps they’ve already mastered. Deliberate practice, by its nature, must be hard....
Amateur musicians, for example, are more likely to spend their practice time playing music, whereas pros are more likely to work through tedious exercises or focus on specific, difficult parts of pieces...
When you want to get good at something, how you spend your time practicing is far more important than the amount of time you spend. In fact, in every domain of expertise that’s been rigorously examined, from chess to violin to basketball, studies have found that the number of years one has been doing something correlates only weakly with level of performance."
tl;dr
10,000 hours may not make you expert. Constantly increasing the level of difficulty of what you practice/work on might be more important.
I've heard it's both - that you need to constantly increase the level of difficulty of your practice for 10,000 hours. The research, after all, says "10,000 hours of deliberate practice", and deliberate practice is what you describe above.
If you have some talent and enjoy it, you can teach yourself programming to a pretty high level in a few months. But to master it, like any complex craft, takes a lifetime.
Well, this is the single article I most recommend everyone read. Mr. Norvig writes specifically about programming, but this applies to every discipline, like music, painting or photography. The way this article has spread, I guess most programmers must have glimpsed it, or at least the good ones.
It doesn't matter if you know programming even with 10 years' experience, since people with much less will get a job by selling and people with more will not get a job if they so much as look funny.
Study how to build a personal reality distortion field; that is the ability most in demand currently.
Tip: Don't focus on learning programming. Focus on learning to find and solve tough problems.
Once you go that direction you will learn things along the way. 10 years or 5 or 20 don't even matter then. Because you will be achieving the end result of why you should even be learning programming.
Though I agree that those "For Dummies" books grossly underestimate the time needed to be a good programmer, I really wish he hadn't used the "10,000 hours is the ultimate immutable number of hours you need to 'master' a skill" framing. It's nonsense.
I agree with you on a certain level, since there are so many outside factors influencing how quickly you can become a good programmer. It's not as if you are an average programmer for 10,000 hours and then you magically become a great programmer in the 10,001st hour.
That being said, I don't think the estimate is totally wrong. I've been programming for five years, have just gotten through a Software Engineering degree and am comfortable using three or four languages, but there are many people around me with more experience whose development skills I am just in awe of.
The main point is that nobody can teach themselves to be a programmer (or at least a good one) in three days. To build up the amount of skills and experience required takes many years.
Not nonsense, but somewhat arbitrary. Given the present topic, maybe a power of two would be more apt, although equally arbitrary. 8,192 hours (2^13) is the power of two closest to 10^4, based on:
2^⌊log(10^4)/log(2)⌋ = 8192
(⌊x⌋ = floor(x))
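That back-of-the-envelope value checks out. A one-liner sketch of the same formula:

```python
import math

# Largest power of two not exceeding 10,000: 2^floor(log2(10^4)).
hours = 2 ** math.floor(math.log2(10**4))
print(hours)  # 8192
```

(And 8192 really is the nearest power of two: the next one up, 16384, is further from 10,000.)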
It's just an anecdote. It has some connection to reality, but it's not like the fine-structure constant or the gravitational constant.
I've heard this figure bandied about quite often as well, and I think it's mostly anecdotal and should be taken as a general rule rather than empirical fact.
I believe Malcolm Gladwell was the guy who came up with the "10,000 hour rule". [1] Norvig's article actually quotes Gladwell as the source of the 'rule'.
One of the big articles in the field is by K. Anders Ericsson and Neil Charness. It is titled "Expert Performance - Its Structure and Acquisition." You can find a copy at:
Alternately, you could read "Deliberate Practice and Acquisition of Expert Performance: An overview". It is also by Ericsson and you can find a copy here:
It's an arbitrary amount of time that relatively few people are able to dedicate to any skill. The rare people that do are known as "experts".
But you won't be as "expert" as someone with 20,000 hours. IMO it's a tautological phrase based on your chosen definition of "expert". That's why I dislike it.
Pilots' experience is often measured in hours. 10,000 is nothing special (10 years?), but to a lay person they sure know a lot about piloting, and at a dinner party I'd defer to them as an "expert" on anything to do with flying.
I think many want to learn so quickly because we need to run faster and faster just to stay in place within the technology fields....just look at all the new things we see on hacker news each day
“We would love for this to be rolled out to every high school in America,” said Carson. “But we wanted to fix a couple schools, go deep, and really understand if it works.”
He replied "Eight hours."
"Eight hours?"
"Yep, eight hours, every day. It's pretty much just like driving a bus. The hard part is keeping your ass in the seat."
Fifteen years and a master's degree later, his advice remains remarkably accurate.