My big eye-openers (some from postdoc) were more about the sociology of science than the day-to-day productivity:
- Even the most blatantly wrong and illogical published work can only be displaced by another publication that explains/does the same phenomenon better; i.e., people are going to keep believing in phlogiston until someone shows them oxygen. If you simply point out inconsistencies in phlogiston theory, in person or in writing, they may well make a variety of unwanted psychological deductions about you.
- Similarly, nobody actually enjoys being around critics or enduring criticism, and therefore you will observe many senior scientists partially avoiding the major downsides of being a critic by artfully concealing criticisms inside what sounds to the uninitiated like mutual affirmation sessions. You have to listen very closely and learn the lingo to pick this up.
- Never question a scientific superior (other than maybe a direct mentor or very close colleague) with any other approach besides "I have a helpful suggestion about how you can maybe reach your intended destination better/faster/more precisely". Regardless of where that destination might be, such as off a cliff or into a wall.
- The opinion/fact ratio you are allowed to have as a scientist is directly and very strongly correlated with seniority, H-index, and so on.
- The incentive structure of scientific publication is such that there are big rewards for being right on an important question, bigger the earlier you are to the party, and little to no penalties for being wrong, so long as the error cannot be provably and directly linked to fraud. There are a variety of interesting consequences to this incentive structure.
In addition to ringing true, this seems largely in line with Thomas Kuhn's thesis in his Structure of Scientific Revolutions [0], a book which, despite its shortcomings, should be required reading for anyone in a STEM field.
Kuhn's thesis doesn't have a lot of space for the sociology of science to have this kind of influence. Of the classical theses of scientific progress, this comes closest to Lakatos' thesis of research programmes.[0][1]
[1] Lakatos, Imre. (1978) The Methodology of Scientific Research Programmes: Philosophical Papers (J. Worrall & G. Currie, Eds.). Cambridge University Press.
> The incentive structure of scientific publication is such that there are big rewards for being right on an important question, bigger the earlier you are to the party, and little to no penalties for being wrong, so long as the error cannot be provably and directly linked to fraud.
This is fantastic insight and I'd like to thank you for sharing it with our group.
Would you agree that the model of rewards for correctness and penalization only in the case of fraud is the core feature of science? And what separates it from business or politics where being an honest failure is worse than being dishonest but successful?
Again, this is a great post, and I think you have a fantastic future in the sociology of science!
I think science is too big a thing to have a small set of "core features", and the question of how to usefully define "honesty" in a scientific context is another big topic, but reading about "bullshit" (the term of art that has its own literature, not the colloquialism) is a good place to start thinking about it.
I would suggest that fraud is one of the rarest types of dishonesty, because people who are both smart and dishonest have less risky ways to proceed, and that such people are very glad fraud exists, because it misdirects attention away from their arguably more damaging and prevalent methods. Feynman has a passage about how honesty in science is more a state of mind, which I agree with. But really, the techniques to be dishonest with low risk are the same in science, journalism, politics, and business.
My field isn't sociology of science though; these are just views from the genomics trenches.
I'm joking a bit with the style, and my limited experience leads me to agree with the OP.
The middle paragraph includes my sincere response that a system where discovery is rewarded, failure forgiven, and dishonesty punished is ideally suited to the mission of science.
So I was left wondering if the OP would expand on what their thoughts were about the interesting consequences.
Very early on, I noticed that graduate students tend to be idealistic, postdocs extremely cynical, and faculty ruthlessly pragmatic perhaps to the point of occasional shortsightedness. Clearly, something about this progression is expected and normal. I'm a postdoc now, so I'm right on schedule.
I think the way it ultimately works is that you have to be disillusioned from the grade-school fairy tales told to the public about how science works before you can learn to live and work in the environment that actually exists rather than the one you wish existed.
> "Never question a scientific superior?" Not parsing that concept, please elaborate.
tech < grad student < postdoc < junior faculty < full prof < Big Guy/Gal < Nobel Laureate < NIH Director
People above you in that chain will accept limited feedback on methods to attain their chosen goals and will greatly resent questions about whether their selected goals are worthwhile/realistic/rational, or whether their gestalt vision of the field's conventional wisdom is correct.
"People above you in that chain will accept limited feedback on methods to attain their chosen goals and will greatly resent questions about whether their selected goals are worthwhile/realistic/rational, or whether their gestalt vision of the field's conventional wisdom is correct."
Corporate management has the exact same situation.
Yeah, in corporate world, at least you're paid to not care, and can change jobs easily. In academia, you're paid shit, and changing labs is not nearly as easy.
I agree with most of the GP's points and I don't think of them as defeatist, but rather a call for realism when dealing with people (versus data, which have no ego to bruise). It's very hard to devise a system that rewards individual achievement without ever falling prey to classic human flaws. The good news is that science over time tends to be self-correcting, and all that requires is a commitment to shared principles and methods, combined with enough anarchy that no one individual can screw up an entire field (Trofim Lysenko being the most extreme example, but any bureaucracy can accomplish this).
The presentation is cynical, as xab31 themself attests, but I don't think it's defeatist.
1. Bad work only being displaced by good work: everything works like this. To replace some useless commercial product (take your pick), someone has to come up with something better. Same goes for information.
2. Nobody liking criticism can be rephrased as it being important to attack ideas, not people, when you have to work with those people.
3. "Never question a scientific superior" is the first piece of advise I think is too cynical. As a warning against undermining a colleague in public when you need their support, I agree, and that's kind of a restatement of #1 and #2. But science really does have a culture of publicly debating contentious ideas. You can definitely be more critical in an event specifically held as a debate / open forum than in a presentation Q&A though, and at a social event it's polite to be at least vaguely supportive.
Kind of a tangent to the later points: day-to-day scientific research is mostly chasing dead ends and other activity that is (in hindsight) mostly useless, but there is genuine societal value in having a large body of skilled workers available. That is, science spends a lot of time spinning its wheels trying to figure out the right question to ask, and once this becomes clear there is rapid progress. This means the papers published in between the breakthrough periods aren't really worth paying attention to unless you work in that area. Having a lot of scientists and engineers in the workforce, so that we collectively have a decent chance at obtaining and exploiting the next breakthrough, is the point; the papers are just a byproduct.
>"Never question a scientific superior?" Not parsing that concept, please elaborate.
If you think you've been put on a bum topic, or your supervisor has put you on the scientific equivalent of a PIP with no way up or out, your room for maneuvering is limited, to put it politely.
I understand the impulse to not want to be defeatist, but sometimes it’s both easier and more productive to stop running into the same walls over and over and instead find the path around them.
Well, as defeatist as going to work for a FAANG and not expecting that your managers will give a toss about fairness, their users' privacy, or the spirit of regulations. Life is like this. Right now in the African savannah a lion is mauling a gazelle; it happens daily.
This is true here, as well. I once asked about something related to SSL/TLS (fairly politely) and was somewhat mockingly escorted to the corner by some groupies, apparently because I had responded to an Apache developer.
I was just trying to learn. Learning bad is what I learned.
The sociology of science is so interesting. (Not the field, but the subject.) Here are some of my favorite quotes/thoughts:
This one is a direct contrast to your advice (which speaks volumes about what's wrong with academia): "A good scientist, in other words, does not merely ignore conventional wisdom, but makes a special effort to break it. Scientists go looking for trouble."[0]
This was written about physics at Caltech, but applies more broadly. It explains why the ability to 'manage up' is so critical for early-career success. "[...] departments are run, for better and worse, by the professors who often lack managerial experience. Worse, they are generally unaware of this shortcoming, assuming incorrectly that management is trivially easy compared to their topics of study and merits minimal effort. We have now seen the consequences of this lack of attention." [1]
Academic politics is a great reason not to stay in academia: "Look for environments where competitors see themselves as playing a game, rather than fighting for survival — this prevents rankings within the hierarchy from becoming an existential problem." [2]
This book has a great chapter of career advice, here's a gem: "Don't build a pyramid. Everyone seems to build one pyramid per career. A pyramid is an ambitious system that one person really cares about and that winds up working well, but then just sits in the desert because nobody else cares the same way. This happens usually just after leaving graduate school." [3]
"In general, status-conscious places are miserable for everyone, and the more, the worse." [3, next page]
Gatekeeping is predictable from the incentive structure: "For all the high-level talk about how we need to plug the leaks in our STEM education pipeline, not only are we not plugging the holes, we're proud of how fast the pipeline is leaking." [4]
"So why am I not an academic? There are many factors, and starting Tarsnap is certainly one; but most of them can be summarized as 'academia is a lousy place to do novel research'." [5]
"...whereas Newton could say, 'If I have seen a little farther than others, it is because I have stood on the shoulders of giants,' I am forced to say, 'Today we stand on each other's feet.'" [6]
It is the several repeated and very costly attempts I made to do just that which led me to give the advice I did.
The pyramid quote is an interesting one. Obviously there is a tension between being passionate about an idea/goal/cause but not being overly siloed. It seems the best-case scenario is: pick your passion, find some people who're thinking in the same general direction, and compromise the vision among yourselves.
Let's just say that the thought of solving some of the problems I'm interested in from outside academia has occurred to me. But I'm sure it's not all sunshine and rainbows on the outside, either, and moving from academia, whose primary motivator is risk aversion, to something like a startup is an extreme culture shock, the more so because my objective would be building something real, rather than bilking gullible VCs into an acquihire.
Well, I do aging research (mostly from a computational+biochemical perspective). I've met most/all of the important players in the field, and it baffles me how this important area of research continues to be a backwater, as far as the public's concerned.
It's hard for me personally to think of something more important than aging, so if I were to expand outwards, it would be to pursue the same goal, but maybe with fewer constraints. In general, I'd work towards streamlining and automating certain aspects of it. Technologically, the field is in the Dark Ages. There are realistically ~200-300 (max: 5000, including subordinates and techs) people in the entire world working on this seriously, which is fairly mind-boggling, considering that it is the primary risk factor for cardiovascular disease, cancer, and indeed COVID-19, along with many other diseases, to say nothing of the more transhumanist and futurist implications.
Young people need to realize that the things we love about science: the uncompromising search for the truth, its international and no-boundaries character, the ability to bow to evidence, the ambition of the ideas, are just a very distilled fraction (basically the highlights) of what is a very mundane, fragile, political human activity, full of petty and lame characters, absurd situations, and pathetic developments.