
I'll take another tack and suggest that productivity today is more a function of what libraries are available than of language choice.

A language like Python has a massive number of really good libraries spanning a range of disciplines. Is it the "best" language (whatever that means)? Don't know. Don't care. You can get the job done and the ecosystem is huge.

I am not comparing Python to Haskell or anything else, just using it as an example. I don't use Haskell and wouldn't know how the libraries compare.

My only point is that when all the smoke and bullshit clears, the only thing that matters is whether you can get the job done as required. If that includes a performance metric, then language performance is important. If it does not, then library availability or other criteria quickly become more significant.

From my own perspective, it would take a lot for me to choose anything other than Python for my work. When a platform mandates something else (native iOS apps, say) you have no choice. On embedded systems it's mostly C. When Python fits I can get shit done with it. Nothing else matters.




The graphs are not productivity vs language choice. They are productivity vs experience in a given language.

In languages with strong libraries, I'd expect a diminished benefit from experience. Ten years of Python experience doesn't make you much more productive at standing up a Django application than one year of experience. Thus, the curve for Python might very well be quite flat (since you start fairly high to begin with).
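
To make the Django point concrete, here's a minimal single-file sketch (not from the article or the graphs; the view, route, and settings values are invented for illustration, and it assumes Django is installed). The framework does so much of the work that a one-year and a ten-year Python dev end up writing essentially the same thing:

    # Minimal single-file Django app: python app.py runserver
    import sys

    from django.conf import settings
    from django.http import JsonResponse
    from django.urls import path

    settings.configure(
        DEBUG=True,
        SECRET_KEY="dev-only-placeholder",  # placeholder, not a real key
        ROOT_URLCONF=__name__,              # routes live in this same file
        ALLOWED_HOSTS=["*"],
    )

    def hello(request):
        # Django handles request parsing, routing, and the response cycle.
        return JsonResponse({"message": "hello"})

    urlpatterns = [path("", hello)]

    if __name__ == "__main__":
        from django.core.management import execute_from_command_line
        execute_from_command_line(sys.argv)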

Lisp, on the other hand, may not start as high on the productivity scale. But because of its powerful macro system, the language can be tailored to suit your problem better than anything else, so experience keeps paying off and the curve keeps climbing.

That's what the graphs are getting at: how valuable experience is (and how that compares to how valuable we think our experience is).


Libraries can only take you so far.

A Python developer with limited experience is likely to reach for third-party libraries in simple cases where they really aren't needed. As the application grows, this turns into a maintenance and/or performance nightmare.
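
As a hedged illustration (the function name and file layout here are invented for the example): a simple task like reading a small CSV doesn't need an installed dependency at all, since the stdlib already covers it:

    # Illustrative sketch only: instead of `pip install pandas` to read a small
    # two-column CSV, the stdlib csv module does the job with nothing extra to
    # install, pin, upgrade, or debug later.
    import csv

    def load_prices(path):
        # Expects a CSV with "name" and "price" columns (invented for the example).
        with open(path, newline="") as f:
            return {row["name"]: float(row["price"]) for row in csv.DictReader(f)}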

Not to mention that it takes experience to understand, troubleshoot, and fix third-party code. When things go wrong, the dev with ten years of experience is going to be vastly more productive than one with less.

I am a Python dev with >10 years of experience and I manage numerous junior devs with limited experience, so I see this kind of thing all the time.


But are you actually gaining productivity when fixing third party code? Or are you losing productivity? I would say you are paying back the gains you got from using the library in the first place and therefore losing productivity. Granted, you generally come out ahead.

I think that's the plateau in most languages. You get to a point where you are fighting the language/ecosystem as much as it's helping you and so productivity plateaus.


[deleted]


"may not" != "does not"

I wasn't referring to the graphs anyway with regard to absolute productivity. Rather, I was referring to the parent's claim that Python's libraries make you more productive, which may be the case.

Also, "why wouldn't they be"? Whitespace is boring. Maximizing useful space on each graph would require a different scale for each language. Also, productivity in each space is varies widely limiting apples-to-apples comparisons across languages (even within languages this is tricky). There's no question that for some workloads Python is more productive than C++ or Haskell. The discussion is really about what the learning curve for the language is (hence the title).



