The Expert in a Year Challenge (experttabletennis.com)
150 points by MichaelAO on Feb 17, 2015 | 64 comments



Related is The Dan Plan, in which a chap is trying to become a pro golfer by dedicating 10,000 hours to it (from zero experience) [1]. I seem to recall it was started after Malcolm Gladwell claimed that it takes that long to achieve mastery in a discipline. Dan's almost at the 5 year mark now. I check back a couple of times a year to see how he's getting on.

[1] http://www.thedanplan.com


Gladwell's 10,000-hour rule is, at its core, anecdotal.

The 10,000-hour rule seems like more of an observation that people who are really good at things are likely to have spent a lot of time doing them.

My anecdotal observation: the people I know who are masters of things were very good at those things early on, and that encouraged them to practice more and more.


I thought it was, at its core, research by K. Anders Ericsson ("The Cambridge Handbook of Expertise and Expert Performance"), but I have to confess that I have read neither author, so could you please expand on why it is anecdotal? Thank you!

UPDATE: Wow there seems to be a lot of information in previous HN threads such as https://news.ycombinator.com/item?id=7323837 or https://news.ycombinator.com/item?id=8197102 and a response to my question in https://news.ycombinator.com/item?id=8197467


Everything Gladwell writes is anecdotal.


I've been checking in a bit too and am somewhat discouraged. I'm not exactly sure how he is counting his rounds towards his hour goal, but it seems like he is playing more and more and practicing less and less. He also seems very equipment obsessed. It could just be that what he writes about has changed.


It will be interesting to see how well Gladwell's claim stands up to empirical testing.


What is it you imagine Gladwell's "claim" to be?

Because he's repudiated the idea that he or any other informed researcher claimed 10,000 hours was some magic number, or that practice alone is a substitute for talent: http://www.newyorker.com/news/sporting-scene/complexity-and-...

It would have been more accurate to say "It will be interesting to see how well a gross mischaracterisation of Gladwell's claim stands up to empirical testing".


He probably imagines Gladwell's claim to be, and I'm quoting from Outliers, "In fact, researchers have settled on what they believe is the magic number for true expertise: ten thousand hours." You'll note this unequivocally contradicts the claim you just made.

That Gladwell later wrote a much, much less popular article for the New Yorker in which he walked back that totally outlandish claim doesn't justify indignation at people remembering what he wrote.


The "Malcolm Gladwell 10,000 Hour rule" Rule:

In any discussion about mastery, people who have never read Outliers will attribute Gladwell with the idea of the 10,000 Hour Rule, take it out of context, and argue with it as if it were the main point of the book, or as if it was presented as a hard and fast rule.


The truth is that the book was not very good and didn't really have much of a point.


Which, rather ironically, didn't make it an outlier of the pop psych book genre.


This is a good read on the 10,000 hour myth. I can't remember where I originally heard that the idea as pushed by Gladwell has been debunked; probably the You Are Not So Smart podcast. I've no personal insight into whether this is a rule or a myth, but the original researcher seems to disagree with the pop-sci understanding of it.

http://changingthegameproject.com/the-10000-hour-myth/


A research paper authored last year cited a study which found that only 30% of the variation in performance could be attributed to deliberate practice.

http://www.sciencedirect.com/science/article/pii/S0160289613...


Very interesting, thanks


Which raises the question: what percent of the variation distinguishes the 'expert' from the non-expert?


My understanding was that the claim was that the 10,000 hours rule was a necessary but not a sufficient condition for mastery. So any world class golfer will have played for at least ~10,000 hours, but not every golfer who has played for at least ~10,000 hours will be world class. The only way the hypothesis could be falsified is if Dan becomes a world class golfer in less than ~10,000 hours.


It wasn't his claim originally; he took a paper and summarized it as the 10,000-hour rule, and even the original paper's author has disputed the claim as stated.


There is a better summary of the studies in "Thinking, Fast and Slow".

From what I remember, the claim is that 10,000 hours of deliberate practice work best when:

1. You get immediate feedback on success and failure

2. The thing you are trying to master has a skill set that can be improved, and improving that skill is sufficient.

That #2 is tricky -- a counterexample is stock picking. We have no evidence that there is a learnable skill that will make you better at picking stocks, so 10,000 hours of deliberate practice, even with immediate feedback, won't help.

I think that Dan is following a reasonable deliberate practice plan with experts and that he does get fast feedback. What we don't know is whether golf is something that is primarily/mostly/all skill-based. For example, what if there were a physical body type that made you better and Dan didn't have it -- then 10,000 hours might not help.

Also, the point of the 10,000 hours is to train your intuitive mind (System 1, the fast thinker) to be as correct as your deliberate mind would be (System 2, the slow thinker) if you had time and attention. This whole thing is much better for thinking-based activities, like say, programming. I don't remember if the studies looked at physical activities and "muscle-memory".

Caveat: this is all from memory -- read "Thinking, Fast and Slow" if you have interest in this.


I read Thinking, Fast and Slow, and I also read a better summary of the initial study, by the author I think. The 10,000-hour figure was taken from a study of top music performers (so, not a random population), and there was also a lot of variance; for example, among chess players some achieved mastery in 5,000 hours, while others put in more than 10,000 and didn't achieve it. So lots of deliberate practice is good; the arbitrary 10k-hour figure is BS.


There is a skill that makes stock picking work; Warren Buffett and Benjamin Graham have written all about it.


Could you point to some articles? I've read a few Buffett articles and his suggestion seems to generally be - invest in an index fund and leave it alone.


Warren Buffett's own letters to shareholders are the best source for what he thinks. It is not stock picking (he parks some money in stocks, but that's not the bulk of what he does). He calls what he does "capital allocation", not "stock picking".

His basic strategy:

1. Write insurance: that gives you tons of cash up front. Get good at setting the price right so you can make money on the float (using the money in the near term, paying it back in the future).

2. Now, with all of this cash, invest in things (like utilities) that need big cash investment but then pay back forever (basic rent collecting). He likes things with big barriers to entry (moats).

He does not care if his company's stock goes up or down; all he cares about is intrinsic value: the cash generated (now and in the future) and the time value of money. His benchmark is the S&P 500 -- he wants to beat its intrinsic value.

But, read the letters -- they are awesome: http://www.berkshirehathaway.com/letters/letters.html
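
To make the "intrinsic value / time value of money" point concrete, here is a toy discounted-cash-flow sketch. The asset, cash flows, and 8% discount rate are all invented for illustration; this is not Buffett's actual valuation method, just the textbook discounting idea behind it.

    # Toy discounted-cash-flow calculation. The cash flows and 8% discount
    # rate below are invented for illustration only.

    def present_value(cash_flows, discount_rate):
        """Discount annual cash flows (paid at end of years 1, 2, ...) to today."""
        return sum(cf / (1 + discount_rate) ** year
                   for year, cf in enumerate(cash_flows, start=1))

    # e.g. a utility-like asset paying $10M a year for 20 years, discounted at 8%:
    flows = [10_000_000] * 20
    print(f"Intrinsic value estimate: ${present_value(flows, 0.08):,.0f}")
    # Prints roughly $98M -- the $200M of nominal future cash is worth about
    # half that today, which is why the discount rate matters so much.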


Very interesting, thanks. Had a skim of the most recent one, will find some time to read them in more detail.


“I fear not the man who has practiced 10,000 kicks once, but I fear the man who has practiced one kick 10,000 times.” - Bruce Lee

I think that attitude is how you get to 'expert'. In the military, for example, you train your muscle memory such that in combat, when your adrenaline is at an all-time high (no drug touches it, IMO), your body just tends to do what it needs to do because you have trained it so. This also ties in with Bruce Lee's view on forms. You have to learn the form, and practice it, until you forget it... if that makes sense.

The bottom line is that practice makes perfect. In today's world, though, with athletes who have the ability to do nothing but train, I think it's harder and harder for a layman to compete. When I was a kid I went to the national Junior Olympics without even trying. If I were that same kid today, I doubt I could make it past state.


Always sad to see this myth live on. It is as self-evident as it is nonsensical.

1) Yes, you get better with practice.

2) Some get better faster than others.

3) Some reach a higher peak than others.

4) 10,000 hours of effective practice is neither necessary nor sufficient (some win the world championship in the high jump after less than a year [Donald Thomas, 2007]; others will never become chess champions, or even be ranked among the top 10,000, after 30,000 hours of dedicated practice).

5) Even the original study gave 10k hours only as the AVERAGE practice time for THOSE few who eventually became masters (i.e. were super-talented), with a range of 3k-30k hours needed.

However, it is always good to see people wanting to expand their skill set through dedicated work - admirable

(but it has NOTHING to do with the fundamentally flawed 10k hr rule)


10,000 hours has got to be the least interesting concept in the body of work around deliberate practice. But it is somehow the concept that is most widely known.

I would love to know the psychology behind why that concept was so sticky, given that it seems self-evidently worthless to know where the theoretical cap on self-improvement lies. I mean, how many things do we put 10k hours of practice into?

The much better concepts from this world are:

* The deliberate part of deliberate practice. There are very different qualities of practice. Tim Ferriss is making a living hunting down concepts around minimum effective doses of practice, and his third book (Chef) is very good for this. These are all essentially about finding more effective ways to practice so that you get more out of each hour.

* The experienced non-expert as an explanation (and pejorative) for everyone who has 10,000 hours of experience but isn't very good.

* Difficulty: the ideal practice difficulty is uncomfortable, falling between trivial and demoralizing.


The 10k hour concept is sticky/durable because it holds out hope.

The hope that you could be one of the best in the world if you just worked hard enough.

The hope that it's not too late.


Further, the reason the concept of 10,000 hrs is sticky/durable is because it is:

1. Actionable - As demonstrated in the blog post, there is a step-by-step set of actions to be carried out.

2. Authentic/Believable - No one seriously doubts that any expert has spent 10,000 hours on their craft.

3. Concrete/Measurable - It's a very specific amount.

4. Relevant - It's known that it takes a lot of work to become an expert and who doesn't want to become an expert?

5. Simple


It's also politically correct and fits in with the current generation's thinking of "you can be anything you want." This comes up now and again on HN as well, in discussions about whether everyone should/can learn to program.

Will practice make someone better than the average person? Definitely. But to be the best in the world, you have to have some genetic advantage - be it your body composition or the way your mind works.

I always like to point out the ice hockey player Ed Jovanovski. He didn't start playing hockey until he was 11 (which is pretty late) but was drafted 1st overall into the NHL just 7 years later. However, his dad was a semi-professional soccer player so his genes probably gave him an advantage over others who started playing much earlier but didn't make it.


There is also the interesting case of people with multiple areas of expertise. This is an understudied phenomenon, but I think it can shed light on the issue.

There are some skills that just come more easily to a person. This is demonstrated when the same person does very similar things for similar lengths of time, but in one case becomes legitimately expert and in the other gets better yet falls well short of true expertise. I'm a brilliant poet (if I do say so myself), but while I've invested at least as much time in prose, I know I'll never be as good.

This doesn't mean practice isn't important, but it does mean that talent matters, both for how fast you improve and the ultimate capability you reach.


Where in this project do you see any myths being propagated? I followed the project all last year, and it is extremely reasonable and not in any way based on any myth I'm aware of.


The key to greatness is not making the same mistake twice.


This goes double (or half?) for test pilots.


One of the most interesting things I've read (or listened to) about expertise is a podcast interview with the author of The Sports Gene[0].

> So in my second chapter I sort of tell a story I call the 'Tale of Two High Jumpers,' in which I sort of profile two high jumpers, one named Stefan Holm, who over the course of 20 years--and by his estimate, 20,000 hours--made incremental progress every year, to the point where he became one of the best high jumpers in the world. And then in 2007 he travels to the World Championships, and is met by a total unknown, a Bahamian jumper by the name of Donald Thomas, who just recently had started high jumping, basically because he had found out he was good at it on a lunchtime bet at his college. And so, Donald Thomas is closer to 0 hours, and Stefan Holm is closer to 20,000 hours. So they average 10,000 hours. But Donald Thomas actually wins that competition. And so I thought it was a good story to explain the fact that not only is there huge variation, but different athletes can get to the same place with both different biology and different training programs.

[0] http://www.econtalk.org/archives/2013/09/david_epstein_o.htm...


Reminds me of Moonwalking with Einstein.

https://en.wikipedia.org/wiki/Moonwalking_With_Einstein


There's a writer from Esquire who's basically made a living doing experiments like this. He's written things like The Year Of Living Biblically [1] (self-explanatory), The Know-It-All (he read all of Encyclopedia Britannica) [2], and other experiments.

There are a lot of others who have done similar things. Matt Cutts from Google does 30-day experiments. Karen X Cheng learned to dance in a year. There's something quite compelling about contained experiments.

I'm doing my own version of this, as a writer. I'm working on writing 1,000,000 words [3]. I'm doing this in 1,000 sets of 1,000. I'm currently at 240.

760 to go.

[1] http://ajjacobs.com/books/the-year-of-living-biblically/

[2] http://ajjacobs.com/books/the-know-it-all/

[3] http://visakanv.com/1000/


Definitely an inspiring story.

Comparing "how much of an expert" you can become after a full year of good training in different activities would be a great experiment.

Does anyone dare to create a ranking for some? (Say tennis, chess, web development, ice hockey, ...)


I think this has a lot more to do with people’s expectations than with inherent difficulty. For example: what does becoming a chess expert take?


The US Chess Federation title directly below Master is Expert. It'd be a bit more nebulous in other countries.


And of course, there's Dance in a Year

http://danceinayear.com/


One public dataset that I think people should analyze is how competitive ratings change based on the number of events participated in.

For example, in programming competitions the history ratings of everybody is public: http://community.topcoder.com/tc?module=AlgoRank.

One of the most well-known competitive programmers is probably tourist: http://community.topcoder.com/tc?module=MemberProfile&cr=222.... He started at a rating of about 1200 when he was 12 years old, but it took him over 5 years to increase it by 2000 points. For everyone who kept at it for just as long, you can see a similar trend in trajectory. The only difference is the slope at which their ratings increase.

(The ratings probably mean nothing to you, so for some calibration: people with a rating of about 1500 can easily pass a Google interview, and I think a fresh CS grad with no practice will probably start at around 1000. Which probably gives hope to the average CS grad, since even for the world's best, it took about 10 years to get to where they are.)

I just thought this was a nice way to quantify growth vs practice.
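
For what it's worth, here's a minimal sketch of what such an analysis could look like. The player names, ratings, and data shape (one rating per contest entered) are made up for illustration; a real analysis would first export each ranked player's public rating history into that shape.

    # Hypothetical rating histories: player -> ratings after each contest
    # entered, in chronological order. All numbers are invented.
    from collections import defaultdict

    histories = {
        "player_a": [1200, 1250, 1340, 1420, 1510, 1580, 1660],
        "player_b": [1000, 1020, 1060, 1130, 1170, 1240, 1300],
        "player_c": [1500, 1540, 1610, 1700, 1790, 1850, 1940],
    }

    # Per-player slope: average rating gained per contest played.
    for player, ratings in histories.items():
        gains = [b - a for a, b in zip(ratings, ratings[1:])]
        print(f"{player}: start={ratings[0]}, end={ratings[-1]}, "
              f"avg gain/contest={sum(gains) / len(gains):.1f}")

    # Mean rating by contest number across players -- the "similar trend,
    # different slope" picture described above.
    by_contest = defaultdict(list)
    for ratings in histories.values():
        for i, r in enumerate(ratings, start=1):
            by_contest[i].append(r)

    for i in sorted(by_contest):
        mean = sum(by_contest[i]) / len(by_contest[i])
        print(f"after contest {i}: mean rating {mean:.0f}")

Plotting those per-player series (rating vs. contests entered) is the quickest way to see whether the trajectories really do differ mostly in slope.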



I believe the number of hours required to become an expert depends on both the nature of the field and the competition in it.

Although there are a lot of table tennis players in Britain, probably not too many devote significant effort and time to mastering it. Thus, the effort required to be in the top echelon is not that great. The same cannot be said for, say, American football in the US or soccer in many countries.

The 500 hours or so he spent on table tennis would also be woefully inadequate for mastering mathematics, but for a different reason. (Mastering math requires knowledge and skills covering a huge amount of material accumulated over centuries.)


As is typical in most threads about this perennial topic here on HN, the people who are most relaxed about putting in their hours of practice are the people who are most relaxed about their definition of expertise. To be indisputably an expert (to have, in K. Anders Ericsson's formulation, statistically reliable superior performance) is not the work of a moment, no matter how much "innate" ability one starts out with.


I love the idea of becoming an expert at something in an unusually short period of time. I felt like this article was entertaining and enjoyable.

That being said, with regard to the 10,000 hours discussion being had here, I feel the issue can be summed up as a classic confusion of correlation and causation. It is obvious that a master will have practiced more than the apprentice, but the practice itself is not the sole source of mastery.


I have a tennis version of this. Every day makes a huge difference.


You do? Can you share a little bit of your experience? I just did something similar with tennis last summer and measured my improvement using the Babolat Play stats. If you get the chance, check out Jeff Salzenstein. I improved and learned more watching his videos and courses than I did in the last 7 years. Just for example, from the Play stats I was hitting 1 out of 4 forehands in the center of my racket before I knew about Jeff Salzenstein, and in less than a month I was at 8 out of 10 in the center. Would like to hear about your experience though.


Reminds me "Balls of Fury" movie

http://www.imdb.com/title/tt0424823/


So flawed I don't know where to begin.


[deleted]


What's your issue with the word "expert"? The guy was trying to land in the top 250 in a year. That would comfortably place him in the "expert" player category (had he succeeded).


I think expert here is dependent on different people's perspective.

My credibility: I had a 2000+ USATT rating 8 years ago, was #1 under 18 in Texas, and top 10 under 18 in the USA. I trained in China (Wuhan and Shanghai) for 2 summers at my peak. My training buddy and archrival is currently the USA national champion - Timothy Wang, originally from Houston, who quit high school to pursue the sport professionally in Europe.

The kid, from my perspective, is still an enthusiast. Granted, he's playing tournaments and learning, but there's another huge gap between where he is now and an "expert" or "pro" in the US.

There's yet another HUGE gap between the #1 in the USA and the top 250 or "expert" level in the world. Timothy went to the Olympics in 2008 and lost his first match 4-0 to a 14-year-old from Korea. The difference in skill and experience is vast; "expert" is just a moniker being used in his marketing. =)

To get to where he is in a year is commendable and definitely takes dedication - to get to "expert" level simply takes time and playing hundreds if not thousands of different players at tournaments, leagues, etc...

My father is the #1 senior TT player in the US based on the last Senior Olympics open in 2014. He is in his mid-60s and still poses a commendable challenge to the top 250 players in the US, simply due to his play style, experience, and racquet/paddle (pips out and penhold) - not saying he always wins the match, but he can typically take a couple of games out of 7. Experts/pros adjust much quicker.

Anyhoo, table tennis is a great sport with little to no risk of a life-threatening injury. It is relatively cheap and provides an amazing workout when played at the "enthusiast" level.

Love that he's bringing more awareness to the sport. It is underappreciated and typically exists in the Forrest Gump/Balls of Fury/frat houses of the world.


I totally agree that this guy did not actually reach expert status. You're correct in labeling him an enthusiast. I think, though, if he'd hit his goal of ranking in the top 250 for England, there'd be a compelling argument for considering him an expert.

At some point, "expert" status necessarily comes down to a subjective and somewhat arbitrary line between players that are very close in skill. Are the top 10 experts? Top 100? Top 1000? If you're comparing to the average Joe, all of these could probably be considered experts. If you're only considering the elite of the elite, then maybe only the top 10 get that classification.

If you look at the NFL, there are almost 1700 players. Obviously these guys cannot all fit into even the top 1000 ranking. Yet all of them are generally considered to be elite players (or they wouldn't retain employment), and they're all being paid over $400K/year for their skills. Arguably, all of these guys are experts at what they do.

Or look at doctors. There are about 300 cardiothoracic surgeons in the US and Canada. You could rank these doctors and come up with a list and arbitrarily say that the top ten are the experts in cardiac surgery. But they are all experts, relative to not only the general population, but to the population of practicing physicians.


"I totally agree that this guy did not actually reach expert status. You're correct in labeling him an enthusiast."

You are such a hypocrite. Being 250 in England does not give you a compelling argument to consider yourself an expert at this SPORT, unless there are 250 players from England ranked in the ITTF.

I guarantee you that a person who is in the top 50 in England is not even close in skill to one who is ranked 250. In any sport there are "experts" who can teach the skill very well. They might not even be really good at the sport. Tons of amazing football coaches haven't been that good at football.

An expert at playing (competing) in any sport is one who can prove it not with his ranking but through his match averages against the top 50, top 100, and top 200 players in that sport. As I already mentioned, the ranking system is completely flawed, especially for the bottom-tier players.

Seriously, you are talking about doctors? Doctors in the US receive very similar (standardized) training, and they don't spend 1 year to become an "expert". How about 12 years for cardiac surgery? And, yes, all doctors are experts at what they do.


In what way am I a hypocrite? Either you're confused about the meaning of that word or you're just throwing out insults now.

You need to let go of the belief that you own the definition of "expert". Firstly, you've been unable to even define what "expert" means to you. Secondly, others clearly don't agree with you on what "expert" means. There is no universal standard for what constitutes an expert. It's a subjective call. If you think that only the top 50 people worldwide are experts in table tennis, I guess that's fine, but you should perhaps be a bit less hostile about it.

I do find it odd that you say all doctors are experts. This is a strangely liberal application of the term "expert" given that you're so stingy with the term as applied to table tennis. There are 10 thousand medical experts in England but not 250 table tennis experts? It must be damned hard to become a table tennis expert.


I agree with your sentiment.

Do we judge an "expert" on their ranking or on their knowledge and ability? If 50,000 people excelled to the level of the current #250, would only that best player be an expert, or could all 50,000 be considered to have excellent domain knowledge and ability? I'd call all of them experts.

Expert and "highly ranked" or "best" don't necessarily need to be the same thing.

Many of us get termed software experts or design experts or IT experts without any ranking systems.


My issue is that being top 250 in England doesn't make you an expert at table tennis. It makes you OK at table tennis. It might give you the privilege of saying you are an expert at table tennis in England, but definitely not within the sport itself.

Ben Larcombe, the author of this wonderful piece, is not an expert at table tennis either. Can he teach someone to be an expert? Maybe he can, but as a player he is not.

"My England ranking varies from about 150th – 200th depending on how I’m getting on at tournaments. My long-term goal is to get myself into the top 100 players in England."

This quote means that he claims to be an expert at table tennis because reaching the top 250 - out of the 1000 people who are ranked - is an "expert" level.


What constitutes an expert in your opinion? There are some 2.4 million people playing table tennis in the UK[1], so the top 250 is pretty elite. Just looking at membership in Table Tennis England, there are nearly 25 thousand players[2], making 250 the top 1% of people who care enough about the sport to spend money becoming a member. I'd call that expert. It's certainly not just "ok at table tennis".

[1] http://tabletennisengland.co.uk/news/table-tennis-facts/

[2] http://tabletennisengland.co.uk/etta_website/annual_report/a...


I play table tennis, so I guess that makes me a table tennis player. In order to become a member of the England Table Tennis Association, you just have to pay a fee. They don't check whether you can even hold a racquet. So out of the people who pay the fee, it makes you "elite". But out of the 1000 people who are ranked, it makes you average, even in England.

Don't get me wrong. I am all for these experiments and for finding out that you don't have to work on a skill for 10,000 hours to be an expert, but please don't say that being 250 in England makes you an expert at table tennis. It is just a lie.


If you're in the top 250 out of 1000, that doesn't make you average. By definition it's well above average. Moreover, the 1000 ranked players do not represent the continuum of players overall. They are all significantly better than the actual average, which is why they are ranked.


Great spin on the word "average" here. It makes you average at table tennis. But we are talking about being an "expert". So being 250 in England does not make you an expert at table tennis. Also, if you knew what you were talking about, you would know that if I wanted a ranking (just for the ranking's sake), I could craft my schedule so that I play weak tournaments just to gain ranking points. The correlation between ranking and expertise is very flawed. Ask any athlete about this and they will tell you that there are a lot of players who play weak tournaments just to get ranking. They might even be top 50 in the country, but it doesn't make them experts.


At this point, it would probably be helpful if you told us your definition of the word "expert".


Please see taylorhou's response.


> I think expert here is dependent on different people's perspective.

Great - so you agree that to many people he would be considered an expert. And so, to all but the elite players, he is an expert.


They may not call themselves an expert but I'm sure they'd say they're a professional. They're just using a synonym.



