What is ‘elite overproduction’? (overcomingbias.com)
288 points by cinquemb on Aug 19, 2021 | 351 comments



While the premise is interesting, I really do not buy some of the contemporary examples given at the end of the post, in particular the one about physics exams. It sounds a lot like the "kids these days" fallacy, which has been the standard complaint of each adult generation since the start of history: https://www.goodreads.com/quotes/63219-the-children-now-love...

I teach university physics. Intro classes are easier, and the exams in these classes are meant to test whether you have learnt physics, not to uncover "gifted geniuses". Once you have shown interest in the early classes, you can decide to go into the more challenging classes, whether the classical "problem based" ones celebrated in this blog post or the more recent "project based" ones, which are just as challenging, just a different style of challenging. Which is good: we should encourage all types of genius, not only the masters of algebra. However, yes, we should avoid belittling the masters of algebra in these conversations as well, the same way we learnt to stop belittling "nerds" and "geeks".

For what it's worth, the bad "problem based" classes that the author bemoans have always existed.

Edit: A sibling comment explained that the Socrates quote I used is probably wrong. https://quoteinvestigator.com/2010/05/01/misbehave/


I saw it more as "teachers these days" than "kids these days".

There are several approaches to exams and grading. Testing what the students have learned is a common one. In that approach, the highest grade is the expected one. You earn it if you have learned everything you are supposed to, or at least 90-95% of it. Lower grades are signs of failure.

In another approach, the exam is supposed to challenge the students. When I was an undergrad, you only needed 80-85% of available points for the highest grade. Even the best students were expected to fail some assignments, because they would then learn from their mistakes. It was nice when it worked, but sometimes the problems were just too difficult for no reason, or required an insight nobody would ever find, except by accident.

One grading approach is open-ended from the top and more common with theses than classes. If your work is everything one would expect, you get the second-highest or third-highest grade. Highest grades are only available for those who demonstrate understanding well beyond the scope of the class or the degree.


> When I was an undergrad, you only needed 80-85% of available points for the highest grade.

In France you only need half of the points. When I was in college, you had to have a mean of at least 10/20 (we grade on a 20-point scale), with no course below 8/20. The best students were usually around 16 or 17; the worst were around 5. Most students were between 10 and 12. I like that system because it gives a wide range for grading people, and most people can always improve. You can also compensate for some courses with others, but only up to a point.
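
The rule is simple enough to state in code. A minimal sketch of the passing rule as the comment describes it (the function name and sample grades are hypothetical):

    def passes(course_grades):
        """course_grades: grades on the French 0-20 scale."""
        mean = sum(course_grades) / len(course_grades)
        # Pass requires a mean of at least 10/20 and no course below 8/20,
        # so weak courses can be compensated, but only up to a point.
        return mean >= 10 and min(course_grades) >= 8

    print(passes([16, 12, 9]))  # True: mean ~12.3, lowest course 9
    print(passes([15, 15, 7]))  # False: same mean, but one course below 8/20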


> Even the best students were expected to fail some assignments, because they would then learn from their mistakes.

Meh, if your goal is for students to learn from mistakes, you can give them those any time during the semester. There is no reason to put them on a test. The best students will try to crack them and will, in fact, spend more time on them than the test allows. Difficult questions, especially in math or computer science, may take multiple days and attempts to solve. Going through that frustration is definitely a learning experience. Putting an hour limit on it during a test just weakens it.

> Highest grades are only available for those who demonstrate understanding well beyond the scope of the class or the degree.

At that point, it becomes in scope. Because "beyond scope" is infinite, you have people guessing which "beyond scope" material they should learn to earn the grade. Tests and exams should not be a game.

Make it known in advance what you are going to test.


The first exam type tests what you are expected to have learned. A good student can assume that they can solve every problem and get a close to 100% score. Each problem must therefore have a small solution that is easy to find without any special insights. That requirement alone limits the applicability of such exams to confirming that the student has a sufficient understanding of the topic to continue to more advanced topics.

In the second exam type, even a good student can't assume they can solve every problem within the time limit. They have to prioritize the problems based on how easy they expect them to be. The exam tests not only what the student has learned but also how well they understand what they can do and what they can't. The same can also apply to other types of assignments. For example, even a good student can't assume that they can solve every homework problem before the deadline.

The third grading approach is usually used for the most important works. Under this scheme, the highest grades are reserved for those who do something exceptional. No matter what the expectations are, merely meeting them 100% is not sufficient for the highest grade.

As an example of the third approach, when I studied CS, our department gave the highest grade to only around 1 thesis in 100. There were no set criteria for the highest grade, except that the faculty had to agree that the work was beyond what one would expect from an excellent thesis. It was something not even a smart determined student could expect to gain, except by accident. Even the theses that got the highest grade were rather unexceptional in the grand scheme of things. Except the one that described an OS that runs on billions of devices today. That's the proper bar an undergraduate student should aim for if they want to guarantee the highest grade under this grading scheme.


> The first exam type tests what you are expected to have learned. [...] Each problem must therefore have a small solution that is easy to find without any special insights. That requirement alone limits the applicability of such exams to confirming that the student has a sufficient understanding of the topic to continue to more advanced topics.

This is an untrue implication. You can have difficult exercises that do require special insights and long solutions and still keep them within the curriculum, or "what you were expected to learn". What you can't do is give exercises that require one more chapter, or the invention of difficult-to-prove theorems that were not taught.


You are absolutely right! The only thing I would add is that it is important to choose the appropriate type of exam for intro classes (the ones that verify understanding and hopefully entertain intellectually, but do not excessively challenge) and for advanced classes (the ones that might even include unsolvable problems, where it is understood that 100% is not expected, and which provide a great sense of intellectual challenge and achievement).


>sometimes the problems were just too difficult for no reason, or required an insight nobody would ever find, except by accident

Sure we can learn some techniques and the way we approach things may let us tackle problems in a way that is more likely to work.

And yet I find it baffling that sometimes formal education wants a student to reach, by themselves, a conclusion that the brightest minds of mankind had a hard time arriving at.

While spending a fraction of the time thinking about the problem, of course.

If you are meant to reproduce the way to the solution (having studied the method already), that's one thing. But asking a student to come up with a bright idea that people spent years unable to find? Now that seems a little too much.


> There are several approaches to exams and grading. Testing what the students have learned is a common one. In that approach, the highest grade is the expected one. You earn it if you have learned everything you are supposed to, or at least 90-95% of it. Lower grades are signs of failure.

> In another approach, the exam is supposed to challenge the students. When I was an undergrad, you only needed 80-85% of available points for the highest grade. Even the best students were expected to fail some assignments, because they would then learn from their mistakes. It was nice when it worked, but sometimes the problems were just too difficult for no reason, or required an insight nobody would ever find, except by accident.

On a more holistic level, the second approach seems better in some ways. Except for exceptionally talented students, it removes the ability to achieve "perfection," so good students would become a little more resilient because of their exposure to failure.


>One grading approach is open-ended from the top and more common with theses than classes. If your work is everything one would expect, you get the second-highest or third-highest grade. Highest grades are only available for those who demonstrate understanding well beyond the scope of the class or the degree.

The problem with this, in the US at least, is that the STEM departments might do this, but the social science and humanities departments use the other grading scheme. It got to be a running joke that 90% of the social science students earned Latin honors, while in the business school it was a small fraction of that, and in the science school it was even smaller.

(Which again brings us back to the thread topic, I guess: when graduating with Latin honors was hard, it was a good way to indicate likely entry into the elite. Now that many can achieve Latin honors, they are no longer a good indicator of entry into the elite, but people still expect them to be.)


> It got to be a running joke that 90% of the social science students earned Latin honors, while in the business school it was a small fraction of that, and in the science school it was even smaller.

Grade inflation in a nutshell.


> In another approach, the exam is supposed to challenge the students. When I was undergrad, you only needed 80-85% of available points for the highest grade.

I've seen a system where they grade on a curve and an A is a 4.0. But there's also A*, which still gives a 4.0 on the transcript but denotes that a student (or a group) was statistically far enough above those who got an A to deserve special recognition (i.e., the threshold for an A is 85% but someone managed to score 98%).
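
The comment doesn't say which statistic the school used; here is a minimal sketch assuming the A* cutoff is a z-score within the A cohort (the function name, cutoff value, and sample scores are all hypothetical):

    from statistics import mean, stdev

    def assign_honors(scores, a_threshold=85, z_cutoff=2.0):
        # Everyone at or above the curve threshold gets at least an A.
        a_scores = [s for s in scores if s >= a_threshold]
        m, sd = mean(a_scores), stdev(a_scores)
        # A* goes to anyone far enough above the rest of the A cohort.
        return [(s, "A*" if sd > 0 and (s - m) / sd >= z_cutoff else "A")
                for s in a_scores]

    print(assign_honors([98, 86, 85, 87, 85, 86, 85, 60, 72]))
    # [(98, 'A*'), (86, 'A'), (85, 'A'), (87, 'A'), (85, 'A'), (86, 'A'), (85, 'A')]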


How does getting a grade in an exam demonstrate anything other than the ability to regurgitate information? It doesn't display understanding along any axis of intrinsic knowledge.

Education needs to take a step back and reexamine what knowledge and understanding means. Right now it's basically a factory for memorization.


> How does getting a grade in an exam demonstrate anything other than the ability to regurgitate information?

Just about every exam I had at university involved solving/proving at least one novel problem that we'd never seen before. For example, we might have seen the proof of a statement under certain conditions, but on the exam you were asked if it was also true under a slightly different condition. That isn't something you could do just through memorizing the proofs you'd been shown in class.

Memorizing and 'regurgitating' would most of the time get you a passing grade, but never a top grade.


That is true about bad exams like the SAT and the GRE. And to be fair, even good exams (text problems that take the student on an adventure through deriving a curious phenomenon, for instance) suffer as their results are confounded with whether you deal well with stress. However, it has long been trivial to make good exams that are explicitly not about memorization: take-home exams, open-book exams, exams in which collaboration is encouraged, project-based exams, some forms of oral exams, and more.


Almost all of my engineering exams were open book, and the ones in EE sometimes had difficulty curves such that a 50% score would get you top marks.

It worked pretty well

I still have stress nightmares about op amps though


>It sounds a lot like "kids these days" fallacy which has been the standard complaint of each adult generation since the start of history

First, pedantically speaking it's not a fallacy (as in "logical fallacy"). It's just a statement that can be right or wrong.

And this is regardless of whether it has been repeated "since the start of history". It could still:

(a) be always true (everytime it has been repeated)

(b) be always false (everytime it has been repeated)

(c) be sometimes true and sometimes true

There's no logical necessity that it has to be (b) just because it's a "standard complaint".

In fact, given what we know about periods of acme and decline, and also about periods of stability, it's obviously (c). Sometimes the complaint is correct.

So, the question is whether in this case, for this period, it has been true or false -- not a facile dismissal with the non-argument that "people have been saying that since forever, so it must always be false".


When successive generations complain about "kids these days", there is a contradiction. Gen 1 says "we were great as kids, but kids these days are terrible". Gen 2 claims they were great as kids and it's Gen 3 that's terrible. Gen 1 and Gen 2 are contradicting each other. Was Gen 2 good or bad as kids? Depends on if you ask them or their elders.

It's possible that there is a constant decrease in standards, such that every generation is worse than the one before it. But I doubt it. If such a steady decline had been happening since the 8th century BC, civilization ought to have collapsed several times over. Tellingly, no one ever says "kids these days are much better behaved than we were as kids", even when it's empirically true. For example, kids these days in the UK drink less than their parents did at that age, but they don't get any credit for it.

What's actually happening is likely sampling bias, where the person making the pronouncement is looking at a few bad apples, combined with nostalgia biasing their recollection of their own childhood.


"It's possible that there is a constant decrease in standards, such that every generation is worse than the one before it. But I doubt it."

On the contrary, that has been my interpretation of these signals since I first started hearing them.

All living things (cats, societies, lizards, civilizations) are born, have an adolescence of growth and development, then decay and die.

So there must be some brief period in the adolescence of a civilization where "kids these days" cannot be applied. Society is growing, developing and integrating.

After that brief period it may then, indeed, be possible for each successive generation to correctly identify and decry the decay of their (society/civilization).


> When successive generations complain about "kids these days", there is a contradiction

It's not a contradiction, if each successive generation is more terrible than the one that came beforehand.


> It's not a contradiction, if each successive generation is more terrible than the one that came beforehand.

The OP considered this though, to quote: It's possible that there is a constant decrease in standards, such that every generation is worse than the one before it. But I doubt it. If such a steady decline had been happening since the 8th century BC, civilization ought to have collapsed several times over.


> But I doubt it.

A contradiction leaves no doubt. It's not a contradiction then. A contradiction is not just something that is unlikely or infeasible, it is something impossible. It means that your set of beliefs is flawed and that at least one premise needs to be corrected for them to become consistent.

OldSecondHand is correct that it is possible for each generation to have been worse than the preceding one. For example, the degradation might be minute, or we may have started from lofty heights, or society may have become more resilient to the degradation. It is possible, and therefore no contradiction. ...just unlikely.


> civilization ought to have collapsed several times over.

Which it did, in some ways.


>civilization ought to have collapsed several times over.

uhm, didn't it?


To the point where we abandoned civilisation and became hunter-gatherers en masse? No. There have been ups and downs for individual kingdoms and empires, but civilisation as a whole did alright since the 8th century BC, Socrates’ concerns about kids these days notwithstanding.


>To the point where we abandoned civilisation and became hunter-gatherers en masse

To the point of the fall of Athens and the Greek empires, the fall of the Hellenistic kingdoms, the fall of Rome, all the way to WWI and WWII, and that's just confined to the Western side of the world...


The fall of Rome was not "the collapse of civilization" and the Roman empire as a political entity never quite ceased existing seeing as how it continued in one form or another all the way up to the modern day by virtue of Istanbul still existing.


Yeah. A few times before the 8th century BC as well.


>When successive generations complain about "kids these days", there is a contradiction.

Not really. It's totally logically consistent if things are going downhill with each successive generation. (In fact, it would still be consistent if different things were going downhill: Gen 2 worse on X compared to Gen 1, Gen 3 worse on Y compared to Gen 2, and so on.) Whether that's the case is another matter.

>It's possible that there is a constant decrease in standards, such that every generation is worse than the one before it.

Exactly.

>But I doubt it.

I wouldn't be so sure.

Note also that since there are no permanent, eternal standards standing outside of history, each generation only has to be worse by the standards of the previous generation for the accusation to be true...


>kids these days in the UK drink less than their parents did at that age, but they don't get any credit for it

Illegal narcotics would like a word.


> When successive generations complain about "kids these days", there is a contradiction.

There is absolutely no contradiction, because in fact they are complaining about different things. Sometimes they are complaining about kids being more violent than they used to be. Other times they are complaining about kids being more rude, more passive, drinking more. Then they complain about kids doing more sport, less sport, reading too much, not reading enough.

If you abstract it away to "kids are bad", then yes, it sounds as if the complaint was constantly the same. But that abstraction is manipulative.

> Tellingly, no one ever says "kids these days are much better behaved than we were as kids", even when it's empirically true.

I have in fact said that multiple times. The "no one ever says" part is not true either.

Yes, people are more likely to comment on negatives that affect them than on positives. That is true about literally anything, including software updates. And that still does not imply that every complaint about a bad software update should be shut down.


"It's not a contradiction, if each successive generation is more terrible than the one that came beforehand."

If ancient Mesopotamians were the pinnacle of civilisation, and we were degrading for a hundred generations, we ought to be living in caves by now. It would be measurable in wealth, IQ, rates of murder, crime, literacy, etc.

Every statistic available indicates that we used to be more ignorant, more violent and less literate. So this complaint belongs in the same box as:

"back when I was young we used to talk to school 12 miles, through a blizzard, uphill, both ways!"


You are making a few mistakes:

* technology accretes even if the quality of the average worker is declining across time. Thus tomorrow will still be better than today, at least from the point of view of things like technical advancement, regardless of what is happening within each person.

* population increases, so the smartest can still be smarter even if the average is declining

* Even if population is not increasing, improvements in communication and replication technology can mean that an increasingly smaller share of the brightest people anywhere in the world can design products that are replicated or simply assembled by masses of others, whereas in the past this required local designers. Think of how many people are truly needed to keep advances in jet engines happening. You have basically three companies that supply all the engines to the entire world.

* And one can even make up for the other -- e.g. in previous times all shoes were handcrafted, now they are mass produced. Thus a lower skilled person in a later period can still outproduce a higher skilled person in a previous period.

Now I'm not saying I necessarily buy the decline thesis. It's an intriguing thesis: basically, that you can be pretty dumb and still thrive well enough to keep reproducing, whereas in hunter-gatherer societies you would have starved to death, so the population would on average be smarter due to the challenges they had to overcome. Maybe this is true, maybe it's not, but the objection that knowledge accretes does nothing to prove or disprove this hypothesis.


"technology accretes even if the quality of the average worker is declining across time."

I cannot fathom how you can possibly write this - surely you cannot be claiming that dumber and dumber workers develop better and better technology, or that technology is some independent quantity that develops without human intervention, like a divine gift of some sort?

"a lower skilled person in a later period can still outproduce a higher skilled person in a previous period."

The last bloke operating the machine doesn't produce anything on his own; you have to compare the whole value chain, including the folks making the leather, making the tools and machines, etc. I am sure the old process included plenty of people doing unskilled groundwork, although I would not bet my house on which way the dice would fall.


>If ancient Mesopotamians were the pinnacle of civilisation, and we were degrading for a hundred generations, we ought to be living in caves by now.

That doesn't follow.

Civilization != technology.

And those previous generations weren't lamenting "kids these days have less technology than us", but "kids these days are worse in X, Y, Z" regarding ways of living.


So kids are worse in some 'X, Y, Z', but by any of the hundreds of statistics we collect, they are better. Except that the younger generation has less sex, but that seems to run contrary to the spirit of the complaint.

What's the difference between an elusive 'X, Y, Z' we can't define or measure, and an 'X, Y, Z' that doesn't exist?


>So kids are worse in some 'X, Y, Z', but by any of the hundreds of statistics we collect, they are better.

That's more of an argument that one can't really trust statistics. Or that they don't always measure what matters...

>Whats the difference between elusive 'X, Y Z' we can't define or measure, and 'X, Y, Z' that doesn't exist?

Well, we can define, and it does exist. Not sure if we can measure (not everything that exists can be measured in some meaningful way. Some things matter in qualitative ways).

So, for some concrete examples: the Roman generations that followed the acme of Rome and steered it toward decline had certain very specific characteristics. Less inclined to work for the common good, more selfish, having fewer children (lamented by Roman critics of the time, and which led to a big problem with economic dynamics, the army, and so on), more concerned with greed and self-indulgence, less interested in civic matters (which was the bread and butter of earlier generations), and so on.


Yes, sometimes it's correct, sometimes it's wrong.

The complaint itself is always there (absolutely always), and should be dismissed every time because it carries no information at all. This does not technically make it a logical fallacy (ironically, most logical fallacies carry more useful information), just a useless waste of time.


>(c) be sometimes true and sometimes true

Of course meant "and sometimes false".


I was once a physics student and recall both grading regimes. There were grading practices on both ends of the spectrum, from the average grade on a test being 50% to the average grade being 95%, with the former camp being a minority of professors.

Having a grading philosophy that boils down to pass/fail A-F distributions is great from the perspective that there are students who paid for a class and the professor ensured that they had acceptable knowledge by the end of the course. The wide grade distributions tended to be hated by most students, but the few folks who scored well could make up for bad homeworks etc.

Now the curious thing about these two grading approaches is that I always recall the material of the classes with wide grade distributions with more fidelity than those graded on 100%-or-bust curves (which is saying something after 11 years). In hindsight I believe the "harder" problem sets led to deeper learning, as one could always see what the next skill progression was, and they incentivized taking risks on spending an extra few hours studying or working a problem to try and get a better score. I also wasn't afraid of coming up with a wrong answer through brave effort vs. looking for a pre-baked solution.

The featured article's core claim is that by adopting the 100%-or-bust style of grading, students optimize to avoid errors. This strongly incentivizes students to crib off of each other's work, past homeworks, and other online sources to ensure that they've identified a truly correct answer. It also forces professors to keep problems closer to the book/lectures, such that students are largely guaranteed success given some amount of effort.

It could be that the older method of grading wasn't doing its job by selecting geniuses, but by incentivizing students to take risks. Is it that surprising that students who spent 16 years of school optimizing to avoid mistakes are extremely risk averse?


Each style has its place. Just like you, I found the "harder" problem sets and related classes much more interesting and exciting. That significantly correlates with me being in this field, after all. It is also a poor style to pick for an intro class, as the intro class needs to provide an overview of the field and teach many diverse skills in relatively minimal depth. The intro classes are not meant to teach mastery. Good schools pay sufficient attention to their students to bump the occasional student up from the intro class to the mastery class that is more appropriate for them.


I always shake my head when confronted with "rationalist" positions. They often consider important questions and issues, but just as often mix these considerations with remarkably immature and historically ignorant thinking. Which isn't to dismiss them, but it seems there's a barrier they need to overcome.


> I always shake my head when confronted with "rationalist" positions.

Rationalism is almost by design going to be either different logic or a different position from what seems and feels like a good idea. That is why it gets its own word.

So while overt rationalists are often bad communicators, it is quite possible that what you are identifying is inherent in the message anyway. If the rationalists agree with common sense then there isn't much to talk about.

Plus it isn't rational to identify as a rationalist, so the really good rationalists are hard to spot.


> Plus it isn't rational to identify as a rationalist, so the really good rationalists are hard to spot.

Depends whether you're valuing broad acceptance or a tight-knit peer group of people who get you.

Though of course the real trick would be to have both, this can be hard in our omnisurveillance society. I just don't think it's a universal law that people who are easy to spot are bad at it. Certainly people who are bad at it are easy to spot.


Most “rationalists” have a barrier to adulthood they need to overcome…

That Scott guy is/was somewhat problematic, but recognizably smart. He's one of maybe five people whose intelligence I consciously notice.

The rest of that community is toxic and misguided, starting with the idea that emotions are somehow "bad".


Recommended link: https://www.lesswrong.com/posts/z9hfbWhRrY2Pwwrgi/summary-of...

I cannot disprove that individual rationalists believe this, but to my knowledge it is not a community opinion.

The problem with emotions arises when you attach emotional valence to beliefs and conclusions, rather than outcomes.

Inasmuch as rationalists can be seen as down on emotions, it is because lots of people allow emotions to distort their thinking about reality. In other words, rather than having emotional preferences over outcomes, they have emotional preferences over beliefs and modes of reasoning. (That's why a core of rationalist technique (short [1] long [2]) is learning to emotionally tolerate the possibility of being mistaken.) This is indeed antithetical to rational thought, but it's hardly the only thing emotions are for.

There is nothing wrong with emotions. They just have no place in the process you use to arrive at beliefs about reality. In other words, rationality rejects the Pratchettian claim that you have to believe "the big [lies]: justice, mercy, duty." [3] Instead, it reframes those as evolved game-theoretic adaptations that are part of the iterated game you are playing with your peers, called "society." This places emotion in its proper context without invalidating it. Quite the opposite, it views emotion as a vital, useful part of human interaction and decisionmaking, rather than the generic default of "just part of what it means to be human".

[1] https://www.lesswrong.com/tag/litany-of-gendlin

[2] https://www.lesswrong.com/tag/how-to-actually-change-your-mi...

[3] https://www.goodreads.com/work/quotes/583655-hogfather


> Instead, it reframes those as evolved game-theoretic adaptations that are part of the iterated game you are playing with your peers, called "society." This places emotion in its proper context without invalidating it.

To me, this leads to almost indistinguishable outcomes as the straw vulcan.


I don't see how it would. If you make me angry, this doesn't lead me to say "I should not be angry", or to deny that I am angry, or to pretend that I'm not angry; rather the opposite, I will (best case) understand both why I am angry and why I should be angry; what the point of being angry is.

If that's a straw Vulcan, we have moved very far away from Mr. Spock.


> [rationality] reframes [emotions] as evolved game-theoretic adaptations that are part of the iterated game you are playing with your peers, called "society."

Do the rationalists realize that this is fundamentally a metaphysical or even theological argument?


I disagree; I think it is fundamentally an empirical argument. The source of our convictions should have no impact on their validity, which is the realm of morality. In that light, rationality does not seek to disprove emotions, which would be nonsensical, but rather to justify and contextualize them by understanding the competitive pressures that gave rise to them.

In other words, the idea is that in understanding the game theoretic strategy that our brain is executing when we feel things - "get where it's coming from" - we can bring our emotions into harmony with our conscious beliefs, and either support our unconscious brain in its emotional reaction or rein it in where the evolutionary context is no longer valid.

(I'm not sure where you're coming from with "theological.")


In order to say things like "my brain" one is necessarily creating a distinction between themselves as thinking subject and a "lower self" which consists of a biological substrate. Of course, this is common practice in folk epistemology, but in order to achieve this, one needs the support of an "objective" frame of reference, which can only be achieved through a theological justification (or in this case cosmological, through reference to evolutionary processes).

This was an issue that Kant grappled with, in that the mind cannot distance itself from itself. When we "understand the game theoretic strategy that our brain is executing", we are still implicated in other "game theoretic strategies", we've just made certain ones manifest and chosen to ignore others.

That said, I still support the end-goal of "bringing emotions into harmony with our conscious beliefs", but often that requires turning in the opposite direction: asking questions to the emotions themselves rather than to the world, which requires an entirely different conceptual toolkit (e.g. psychoanalysis, Buddhist thought, etc).

I view evolutionary psychology as in the category of psychological divination techniques, in which a set of predetermined systematic constraints allow you to locate interesting patterns in your day to day life through a process of reorientation. In all sincerity, I suggest astrology as an alternative technique that avoids any normative connotations of "fitness" etc.


> In order to say things like "my brain" one is necessarily creating a distinction between themselves as thinking subject and a "lower self" which consists of a biological substrate.

Come on, that's a stretch. I think it's clear from context that this is referring to unconscious vs conscious awareness, in like the one case where I failed to specifically highlight this distinction (because I wanted to head off just this claim). Both of course happen in the same slab of meat.

> This was an issue that Kant grappled with, in that the mind cannot distance itself from itself. When we "understand the game theoretic strategy that our brain is executing", we are still implicated in other "game theoretic strategies", we've just made certain ones manifest and chosen to ignore others.

Sure, but consider: [1] [2]. Of course conscious deliberation is just another strategy, but it's a strategy that benefits from clarity. There is no good reason why reason and deliberation should be unable to close over itself, let alone as "separate" a component as emotional state.

Reason is a strategy that rose to some prominence in human evolutionary history because it is extraordinarily effective at making sense of the natural world. (Note: There's some claims that reason rose to prominence because it made humans good at making up sensible claims to deceive others - but this sort of evolutionary history would still produce a system that's good at inference, if only as a side effect. Compare GANs.)

> that requires turning in the opposite direction: asking questions to the emotions themselves rather than to the world, which requires an entirely different conceptual toolkit

Now who's doing dualism? The emotions are part of and arose from within the world. Without looking at the historical context in which they evolved, you may not be able to even understand what they're trying to tell you.

And sure, a lot of evpsych is just-so trash. But especially around the game theory of emotions, I think the conclusions hold up.

[1] https://www.lesswrong.com/posts/46qnWRSR7L2eyNbMA/the-lens-t...

[2] https://www.lesswrong.com/posts/XPErvb8m9FapXCjhA/adaptation...


> There is no good reason why reason and deliberation should be unable to close over itself, let alone as "separate" a component as emotional state.

It can be done, but most rationalists never achieve this, in that it requires understanding the cause of an emotion as it relates to one's own subjectivity. This often involves admitting hard truths about oneself, and is the process that psychoanalysis is intended to support. Instead they "rationalize", and come up with explanations like the one that follows, which don't so much understand the emotion as "explain it away" through what amounts to, as I said earlier, a metaphysical argument about the causal origins of the species.

> Reason is a strategy that rose to some prominence in human evolutionary history because it is extraordinarily effective at making sense of the natural world.

This sort of causal argument is why I tend to cringe at rationalist theory, because not only is it not necessarily true, there's no way to even tell whether it's true.

> The emotions are part of and arose from within the world. Without looking at the historical context in which they evolved, you may not be able to even understand what they're trying to tell you.

Ironically I fully agree with this statement, except what I would mean by "historical context" is "the events in your own specific life that influenced your current sentiments", rather than history viewed as a totality. Again, a reference to the totality and causal origin of life; do you see why I describe it as theological or metaphysical? This is very similar in structure to Christian theories on the topic of e.g. original sin.


> This sort of causal argument is why I tend to cringe at rationalist theory, because not only is it not necessarily true, there's no way to even tell if it were true.

I mean, to be fair I'm leaning myself out the window a lot with that claim. But it seems plausible, and more importantly, the argument is resilient to why reason arose. I'm not saying this in a sense of "this is definitely why we have reason, and that's why I am right"; I'm more saying "even in this case where we don't necessarily have reason because it's useful," (note the alternate theory I list) "that is still no reason (heh) to expect it to mislead us." And I think we can confidently say that reason is useful today, at least, and submit as evidence approximately the last 10000 years of planetary history. I think it's not a big stretch to claim that the reason that reason is useful today may be the reason that it evolved to begin with.

> Ironically I fully agree with this statement, except what I would mean by "historical context" is "the events in your own specific life that influenced your current sentiments", rather than history viewed as a totality.

And again, I don't see why it has to be one or the other. Of course, even and especially in "events in your own life" it is ironically very easy to make up pat just-so stories to justify any reaction or sentiment. In comparison I honestly think evpsych is on a firmer footing, particularly given I don't think psychotherapy needs particularly to be correct about the causes it posits in order to function as therapy. Whereas if you are mistaken about the evolutionary cause of an emotion, you will completely misconstrue it and probably behave badly.

> It can be done, but most rationalists never achieve this, in that it requires understanding the cause of emotion as it relates to one's own subjectivity. This often involves admitting hard truths about oneself

For instance, I am very leery of the notion of "hard truths", because it imbues an argument with value based on the emotional work performed in reasoning it out. This is the exact sort of thing rationality says should not be expected to work. - Something does not become more or less true by being hard to admit; an untruth can be as hard to admit as a truth. You need to have a cognitive mechanism that is in principle capable of producing truths to begin with, in order to even arrive at a hard truth that you can emotionally reject. Without that, you're liable to just admit hard made-up claims as a form of catharsis.

Is something being hard to admit evidence that it is true? Yes, but weakly - if it weren't convincing enough to outweigh being hard to accept, you probably wouldn't even entertain it. (How to be convinced by things that are true but hard to accept is actually a core tenet of rationality - see How To Actually Change Your Mind linked above.) But you still have to do the work to arrive at a true belief about yourself to begin with.


> "even in this case where we don't necessarily have reason because it's useful,... that is still no reason (heh) to expect it to mislead us."

There are plenty of cases of reason misleading, going wrong, perhaps as many or more as reason going right. But of course, judgment of right and wrong depends on your criteria, and reason will never help you achieve that which is prior to reason: faith, in the most basic form.

But, in rationalist theory, by positing itself as the means and also the end (we use reason to get closer to reason; we have faith in reason), the term forms the sort of self-sustaining loop that Zizek (or really Lacan) calls a "master signifier", and is also the essence of all theology. All other considerations are shoved under the bus, or else dictated by a sort of bland evolutionary ethical theorizing, like "minimizing suffering", without really interrogating what that even means (if they looked into older philosophy, they would see that thought itself could be considered a form of suffering). It is not necessarily bad to hold to a master signifier as such, as all faith exists in this form, but the rationalists seem especially unaware or unable to interrogate their beliefs to this level.

> I am very leery of the notion of "hard truths", because it imbues an argument with value based on the emotional work performed in reasoning it out.

This is a misunderstanding of what I meant to express, and I apologize for using that term. In my experience (and in the psychoanalytic literature), "hard truths" are "hard" because they involve noticing objective aspects of ourselves that we were previously blind to. The recognition often comes in a flash, with a sense of relief, rather than through some sort of laborious process (of course, the preparation for that flash might well be a laborious process of re-learning how to look!). The "hard truth" is being able to admit something like "actually, I'm not just out to get the truth. I want social recognition after all."

> Something does not become more or less true by being hard to admit; an untruth can be as hard to admit as a truth

Such a focus on truth in relation to psychological affairs feels, to me, wrong-headed, in that all truth is contingent, grounded in some act of judgment. Don't read this the wrong way: contingent != relative. It just means that we need to address the question of "how do I know that true things are true?".

The cultural impact of the enlightenment, especially of Kant, was to collapse "truth" into an intersubjective agreement based on sensory observation ("science"), but how can you achieve intersubjective agreement in affairs which are only observable to a single subject, such as "my emotions" (i.e. "how is psychology possible?")? This is where psychoanalysis had its real epistemological innovation: free association as a technique is a way of generating shared knowledge about what was formerly private. Christian "confession" functioned similarly. Modern behavioral psychology is several steps behind. But with or without such a technique, one can never really achieve the absolute (gnostic) truth about oneself, at least not without a strong frame that answers the question "under what circumstances is a given statement about myself true?"

It seems better to ask the question of "what do I want that I'm having difficulty achieving? And what factual observations about myself can I make, which demonstrate patterns that prevent me from achieving this?" Notice how this moves the direction of thought (of reason, even!) away from "finding the truth" and toward a more environmentally-oriented mode of "understanding", closer to what Plato meant by that term, with its connections to "wisdom". The "hard truth" here as I mean it, is noticing sites of competing desires, as Spinoza and Freud do in their discussions of "ambivalence", and then being willing to take a stance or a risk that attempts to resolve the tensions.

With that in mind...

> I don't think psychotherapy needs particularly to be correct about the causes it posits in order to function as therapy. Whereas if you are mistaken about the evolutionary cause of an emotion, you will completely misconstrue it and probably behave badly.

This is a feature, not a bug, because as I said earlier, the goal isn't to discover some absolute truth about yourself, but to solve problems. Evopsych explanations claim to provide truth, but actually serve to solve problems, through some intellectual sleight-of-hand.

Similarly, the use of reason in thought serves to solve problems, meet individual needs, even if framed as "uncovering the truth". This + my deep involvement in the rationalist world for several years is why I feel comfortable making statements like this one from a few days ago: https://news.ycombinator.com/item?id=28198926


> There are plenty of cases of reason misleading, going wrong, perhaps as many or more as reason going right.

Right, but note what you are saying: reason going wrong.

I don't think I even need to add anything there. Your phrasing already reveals that there is "a rightness ... by which it may be judged" in reason that there is not in faith.

> But, in rationalist theory, by positing itself as the means and also the end (we use reason to get closer to reason; we have faith in reason), the term forms the sort of self-sustaining loop that Zizek (or really Lacan) calls a "master signifier", and is also the essence of all theology.

I agree that reason is ultimately circular. I even agree that religion can also be circular. (I disagree strongly that all religion is circular, or that religion is inherently circular, or even that religion inherently hinges on faith.) Nonetheless, I think that where religion is circular it is so in a different way from reason. When we reject the circularity of religion, we do so not because it is inherently wrong to posit a God that justifies everything, but because it is not useful. The very self-supporting structure of the faith, the retreat to an insulated core element, prevents its adherent from doing any cognitive or predictive work with it. Rather, I would argue the circularity of religion is a consequence of reason, a retreat into self-justifying tautology prompted by the failure of religion to compete with reason on its own grounds.

In other words, reason closes on itself in a way that shapes itself to a particular inherent standard ("that seems to be associated in some way with success in interacting with the natural world", my reason throws in, giving yet another reason why it is judged valuable under itself, argumentatively achieving nothing yet undeterred.) However, religious faith does not even have any such constraint.

> bland evolutionary ethical theorizing, like "minimizing suffering"

I don't know why you are talking about suffering. I was not talking about suffering. I have no idea where you are going with that.

Evolved animals don't seek to minimize suffering. They don't even seek to maximize propagation; they're just the result of a process of relentless selection for things that propagate.

> "hard truths" are "hard" because they involve noticing objective aspects of ourselves

I'm sorry, you were previously accusing rationality of putting too much faith in reason - and now we're supposed to just notice objective aspects of ourselves? With what absolute source of truth, exactly, are we supposed to do this?

> The recognition often comes in a flash, with a sense of relief, rather than through some sort of laborious process

Now, I myself am as much of an addict to this sort of sudden endorphin-supported flash of insight as the next nerd. But - though it is an indicator of a simple theory that supports the facts observed, that does not in itself suffice to establish truth, let alone "objective truth"!

I have a new theory: lots of rationalists are programmers because programmers are used to having the sudden flash of insight be wrong.

> "how do I know that true things are true?".

I mean - rationality in particular does not engage in any attempt to establish truth, so I feel this charge somewhat misses its mark. Rather, rationality seeks to improve the quality of one's beliefs by engaging in processes that interact with the presumption of an ordered natural world in order to come to beliefs that are, given that assumption, more likely to be true.

That's rather why it's called "Less Wrong" rather than "Objectively Right".

> but how can you achieve intersubjective agreement in affairs which are only observable to a single subject, such as "my emotions"

I have a rejoinder to this called "you are not a unique and beautiful snowflake" and also emotions are totally observable to others, like, it's usually not even hard. Doesn't even take blood tests or brain autopsy. When some dude beats my face in with a bottle cause I fucked his wife, I do not have a particular hard time parsing his emotional state - or the evolutionary reasons behind it! Emotions are interesting because they have effects, on yourself (the brain is part of the natural world!) but also on others through you. Honestly, the idea that emotions are primarily individual is just so weird to me. They seem one of the most shared aspects of humanity there is.

> It seems better to ask the question of "what do I want that I'm having difficulty achieving? And what factual observations about myself can I make, which demonstrate patterns that prevent me from achieving this?"

From what you're saying, I take it you have observed rationalists failing at this. But I mean, going just by the text and the people I know, this just seems like solid rational thinking. I don't see any contradiction here.


> I don't know why you are talking about suffering. I was not talking about suffering. I have no idea where you are going with that.

I brought this up because the EA community seems very closely associated with rationalist ideas. Although I'm sure there are other rationalist conceptions of "the good" out there, I've yet to see them clearly articulated.

> ...emotions...

Perhaps the word I should have used is "desire", as emotion springs from desire, and although specific emotional states are easily observable, the underlying desire is what often remains hidden, subject to the other's fantasies. But it's an aside at this point.

> That's rather why it's called "Less Wrong" rather than "Objectively Right".

I'm glad you brought this up, because it raises what I see as the key question, the crux of the matter: "Less Wrong" relative to whom? And for what end?


> I brought this up because the EA community seems very closely associated with rationalist ideas. Although I'm sure there are other rationalist conceptions of "the good" out there, I've yet to see them clearly articulated.

The EA community aren't even unified in minimizing suffering. Most of the formulations I've seen are about maximizing good life, though there are certainly people who care about suffering in the community. (I think it's mostly Tomasik.) I haven't seen any of them try to justify their preferences with evpsych though.

> I'm glad you brought this up, because it raises what I see as the key question, the crux of the matter: "Less Wrong" relative to whom? And for what end?

Once more: Less Wrong relative to a presumed-ordered natural world. To the end of shaping this world to be more in line with one's preferences.


> Less Wrong relative to a presumed-ordered natural world

The "natural world" doesn't have beliefs, the world itself cannot be right or wrong, it just is. So the question again is, Less Wrong relative to whom?

> To the end of shaping this world to be more in line with one's preferences.

And what if one is unsure of one's preferences?


> The "natural world" doesn't have beliefs, the world itself cannot be right or wrong, it just is. So the question again is, Less Wrong relative to whom?

What?

Okay, let me try really small steps.

We assume the natural world is ordered.

We assume that the brain holds beliefs, which are compressive/predictive patterns, about the observations of the natural world.

Inasmuch as these are predictive, and inasmuch as we can take multiple actions, a belief can be qualitatively better or worse at selecting actions that make the world better by one's own judgment.

Beliefs can fail to be appropriate to select actions on (at least) two metrics: they can be "right or wrong", or they can be "more or less useful."

An example of a wrong belief: if you think it is raining, and so you bring an umbrella, but it is not raining (the belief is wrong), then you have failed to improve your position: you are dry anyway, but now you are also carrying an umbrella, which is annoying.

An example of a useless belief: you can believe that your shirt has more fibers than the average shirt of its kind. This may be true, but even if it is true, there is no decision that you would make where the belief is instrumental in deciding which of the possible actions leads to a world that better matches your preference.

(Unless you have very specific preferences about clothing.)
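
A toy rendering of the umbrella example above (the utilities are made-up numbers, not anything from the comment), showing how an action chosen under a wrong belief leaves you worse off than one chosen under a right belief:

    # (carried_umbrella, actually_raining) -> how good the outcome is
    UTILITY = {
        (True, True): 0,    # dry, and the umbrella earned its keep
        (True, False): -1,  # dry anyway, stuck carrying an umbrella
        (False, True): -5,  # soaked
        (False, False): 1,  # dry and unencumbered
    }

    def act(believes_it_will_rain):
        return believes_it_will_rain  # carry an umbrella iff you expect rain

    actually_raining = False
    for belief in (False, True):
        print(belief, UTILITY[(act(belief), actually_raining)])
    # False 1   (right belief selects the best available action)
    # True -1   (wrong belief: carrying an umbrella for nothing)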

The goal of becoming Less Wrong is for your brain to contain fewer beliefs that are wrong, and more beliefs that are right; and also (but less importantly) fewer beliefs that are useless, and more beliefs that are useful. The term "Less Wrong" does not refer to a comparison to some other existing human, but rather to the counterfactual outcome of failing to reflect on your thoughts - Less Wrong is a collection of techniques whose goal is, when applied, to make your brain have fewer wrong beliefs and fewer useless beliefs, in order to improve your ability to shape the world to your preferences. (Whether they actually achieve this is a hot topic of debate - but that is the standard which they set.)

> And what if one is unsure of one's preferences?

That depends on your preferences for your preferences. :)

In any case, setting your preferences is not the purpose of rationality. (Though, as your preferences are part of the ordered natural world, rationality also aims to improve your ability to shape them; this does not tend to be a central consideration.)


So, to be perfectly clear, the main reason for studying rationality as you explain it, is the belief that many individuals are not skilled at determining whether their beliefs about the observable world are right or wrong (as in your umbrella example), and that this is something they would benefit from working on?


Well, most people can generally recognize when a belief is wrong if they are directly confronted with the fact. I think it's a bit more meta - such as creating modes of thought that systematically (again, holding in mind the assumption that nature is regular) produce right and useful thoughts, and dismantling modes of thought that systematically produce wrong or useless thoughts.

But in essence, yes.

(Though of course, again, the ultimate goal is to be better able to achieve preferred outcomes in the world, and if thinking only the word "blue" on repeat produced outcomes perfectly in line with your preferences, then rationality would espouse blue blue blue blue. But at that point, most of the techniques gathered under the label would be useless, because yadda yadda ordered world etc.)


Thanks for sticking with me as I clarified, I see the crux of our disagreement now.

Basically, I am personally of the belief that any "wrong" or "useless" beliefs that one holds are always actually useful, as all belief is instrumental. Ignorance is maintained because it serves a purpose.

I am also of the belief that the only way to determine why one holds a wrong belief is to understand the "emotional" or desire-based reason that underlies or supports it. Hence why I spent a lot of time above discussing emotions. We have seen in recent times the repeated failure of individuals to adjust their beliefs even in the face of overwhelming evidence. Why is this? My stance is that their "wrong" beliefs serve deep emotional needs for them. Most people have stronger desires than the desire to arrive at truth in their evaluation of the world.

The difference in emphasis here is shifting from "the brain as a prediction-making device" to "the human body including the brain is a system that acts to relieve tensions placed upon it", and that prediction-making is one biological strategy for doing so. Of course, better predictions can and will relieve more tension than worse predictions, but my stance is that being able to conceptualize the causes and nature of tension directly will lead to better outcomes, as it permits stronger meta-cognition.

I see the term "preference" doing a lot of work in rationalist thought, and I generally taboo it. The term comes from economics and fails IMO to adequately describe the nature of desire. But if I were to use the term in this instance, I would claim that disentangling one's preferences from internal conflict is the first step one should take prior to tackling the problem of rationality (because we need to know what we even want to get done before we figure out how to do it). This involves interrogating our beliefs about what our preferences are and why, which is something that the rationalist focus on the "world" seems uniquely unequipped for dealing with.

I hope this all makes sense!


Yeah, that makes sense. I really think where this is coming apart is that... like, a lot of the people driving the rationalist project are in environments where holding strategically correct beliefs isn't just useful but vital. Eliezer famously believes on a gut level that he once held a wrong belief and that if he had acted on that wrong belief, the world would have ended. I don't know how that relieves a tension, but I can see how it would make you really really wary of cognitive mistakes.

(I'm having a lot of fun with this debate too, it's really forced me to firm up my understanding of faith and reason.)

> But if I were to use the term in this instance, I would claim that disentangling one's preferences from internal conflict is the first step one should take prior to tackling the problem of rationality (because we need to know what we even want to get done before we figure out how to do it). This involves interrogating our beliefs about what our preferences are and why, which is something that the rationalist focus on the "world" seems uniquely unequipped for dealing with.

I agree with this take! Though I think it's less that this is something rationality cannot do as that, well, Eliezer when he was writing the Sequences, had an overriding belief that put all his other preferences at risk. This tends to focus one's attention on strategy. :)

The most illuminating post on this, for me, is https://www.lesswrong.com/posts/SGR4GxFK7KmW7ckCB/something-... . You see this notion of "rationality is a tool to come to conclusions when the conclusions really really matter" all over his writing. I especially like the phrasing in https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-de... :

> No one begins to truly search for the Way until their parents have failed them, their gods are dead, and their tools have shattered in their hand.

Which implies that for most people, the pursuit of individual rationality, possibly even above mental wellbeing, is maladaptive.

(Man, does Eliezer have a way with words though.)

(To me, personally, the idea of holding wrong beliefs is like a persistent itch or annoyance. I do not have a driving need, but I also have few attachments. And, of course, I am a nerd, and this is all nerd cocaine. All that sums up to create an ideological space in which I am comfortable.)


Thanks for sharing those two posts. These might be instrumental in helping a (more rationalist-leaning) friend and I move forward in a years-long debate about ethics, lmfao

This part particularly resonated with me:

> Your trust will not break, until you apply all that you have learned here and from other books, and take it as far as you can go, and find that this too fails you—that you have still been a fool, and no one warned you against it—that all the most important parts were left out of the guidance you received—that some of the most precious ideals you followed, steered you in the wrong direction—

> —and if you still have something to protect, so that you must keep going, and cannot resign and wisely acknowledge the limitations of rationality—

> —then you will be ready to start your journey as a rationalist.

My trust broke years ago, and I found the rationalists, and then I realized, unlike Yudkowsky (I guess?), that I didn't have something to protect. So, my personal goals are in that sense very different.

In my experience with the rationalist community as well, I found that most people I met didn't have a specific personal motivation besides their beliefs surrounding x-risk, and I personally find the set of axioms required to accept rationalist beliefs on x-risk a bit... difficult to swallow.


They aren't bad, they just shouldn't control you.


It's fine to let emotions control you, you just need to be aware that they are and make sure it's the right emotions controlling you.

I'm happy to continue living in California because emotionally:

* I like the weather
* I like living close to mountains and beaches and deserts
* I like my house and I'm too lazy to move

Rationally I should take the 10% pay cut to go full remote and move to a suburb of Indianapolis.


Speaking as a part-time rationalist, it is not a rationalist position that there is such a thing as "bad goals". At most there may be inconsistent goals. The Utility Function Is Not Up For Grabs https://www.lesswrong.com/tag/the-utility-function-is-not-up... .

If under your goals living in California is optimal, then it is rational to live in California.

The point of, for instance, EA is not that giving to homeless people in New York when there are starving children in Africa is a bad goal - it's that it's an inconsistent goal when one also professes that human lives have equal value. And when one wants to live by that tenet, then EA offers better routes to optimizing lives saved.


This implies that satisfying your emotional needs is irrational, which is not a healthy frame of mind.


And yet, it is the rationalist ethos. This was GP's point - many rationalists believe some quite misguided things.


It isn't the rationalist ethos. Your emotions should obviously guide you, they're a part of you and important evidence.

What you described is exactly how rationalists think emotions are best wielded: you look at a situation and at how you feel about it. Emotions are your barometers for whether that which you believe to be your goal is actually your goal.

Emotions are misused when you treat factual beliefs as goals: then your emotions get attached to them, in unhealthy and cult-like ways.


Many people believe many quite misguided things.

Rationality is one approach to trying to believe fewer misguided things.

That doesn't make it easy and it doesn't mean that success rates at doing so aren't still wildly variable.


Maybe GP means that satisfying emotional needs could be the rational choice.


What would be the point of rationality otherwise?

It doesn’t decide your goals, it just helps you reach them. The number of strawmen in this thread is far too high.


> It's fine to let emotions control you, you just need to be aware that they are and make sure it's the right emotions controlling you.

Ok, I'll bite. If it's the wrong emotions controlling you, what do you do then?

Please don't suggest any course of action that requires you to be in control.


One third of Americans have a college degree; this number used to be in the low single-digit percentages. Do you think we've all gotten that much smarter?


Yes, by education.

Most people could get a college degree. It's mostly hard work; you don't need to be a genius. The few geniuses stay and do research; the normal people go out and apply what they learned to the world.


Mostly agree, but "The few geniuses stay and do research" is probably not true. I don't like the term "genius", but many of the best and brightest are put off by academic elitism and bureaucracy and leave, while at least some not particularly gifted individuals stay because they are more comfortable playing the academic game. One could even go so far as to say that academia needs both of these types, but arguably there are too many "game players" at the moment. (And of course this is an oversimplification; most academics try to play the game AND do useful research, to varying extents.)


This was meant more as an ideal than as a description of the status quo. I agree with what you are saying.


What are some of the benefits of having game players in academia over a pure collection of dedicated nerds?


Somebody has to write the grant proposal. That's the person who will excel the quickest, regardless of talent in the underlying discipline.


> this number used to be in the low single percentages.

It broke through around 1970, for reference.

> do you think we've all gotten that much smarter?

Yes, but not in the way you imply. Having access to enough accurate information, and being able to spend time efficiently accumulating it, has catapulted large swathes of people into investing in a college education - people who would have been stunted along the opportunity, genetic, and economic axes in the past. This is despite the horrendous increases in cost over the last 30 years (imo).


> Having access to enough accurate information and being able to spend time efficiently to accumulate it, has catapulted large swathes of people into investing in a college education.

Factors you're leaving out that I find much more persuasive: low interest rates, federally guaranteed loans, and credentialism/professional-guild dynamics. A degree costs more and gets you less than it did in the past, but you're forced to get one to have most professional jobs. It's an obligate gatekeeping tax, the piece of paper that says you're allowed to be middle class.


In the America that sent a man to the moon, only 5% of people had a college degree.


Contemporary USSR had 8%. That clearly explains why it hadn't made it to the Moon!


If you factor in availability of resources, it is nothing short of amazing what the USSR achieved and how long they could stand up to the US. They probably would have put someone on the moon too, but that wasn't too important after the US was first.


The USSR achieved what it did at tremendous personal cost to its citizens. (Imperial Russia was basically just as harsh.)


Smarter at what? For many things, yes, and for other things, it's the same.


The number of children of rich people with college degrees was always high, and not because intelligence is heritable, but because money is.


Then again, intelligence is also heritable.


Then again, wealth also helps with the college degree if the heredity part of intelligence isn't working out for you that well.


We also inherit the universe. So we inherit a world in which certain qualities will make you better or worse off.


In the last four decades college has become several times more expensive while wages have stagnated and almost all economic growth has gone to the very top, at the same time that the portion of the population with a degree has risen steadily.


Well, it's because of the money system. People were told that working hard and smart is the path to success because people believe in the absolutism of morals even though there is no such thing in economics.

The acquisition of degrees is just a form of specialization to avoid the effects of the current money system. The reason large cities are so rich is that international investors prefer established cities. Parents put money into investments, and their children follow that money out of their hometowns into large cities. Considering how much we do to compensate for the bad influences of our money system, I seriously wonder when nations are going to change it. Probably in 2030.

The problem isn't the "free market" or "capitalism", although the adherents of those ideas tend to love the existing money system. Of course there are also other non-reproducible assets like land and patents, but I feel like properly taxing land should be more than enough. Patents only slow down progress; they shouldn't make things worse.

There are two fundamental problems:

1. The velocity of money keeps going down. It's not circulating at all, meaning fiat currency is becoming less and less a medium of exchange and instead turning into a store of value, just like Bitcoin.

2. Positive interest only works with either a growing supply of money at constant velocity or an increase in velocity at a fixed supply. Basically, inflation via endless debt growth if there is no productivity growth or population growth.
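
For what it's worth, point 2 is essentially the equation of exchange from the quantity theory of money, in standard notation (my gloss, nothing novel):

    M V = P Q

where M is the money supply, V its velocity, P the price level, and Q real output. Nominal income P*Q can only grow if M or V grows, so an economy that must service positive interest out of nominal income needs rising M (ever more debt) or rising V.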

Combine this with the rise of neoliberalism in politics, which basically runs into the contradictions of money at every corner. Global trade? Causes deflation, which lowers interest rates. Less regulation? Businesses become more efficient; more deflation. Austerity? More deflation. Union busting? Lower incomes mean more deflation. You see, at some point you have created more deflation than the money system can survive. What breaks isn't the economy; it's the people who end up breaking. To be fair to neoliberalism: if there wasn't this pesky zero lower bound on interest rates, it could have worked.

I've read an article about the "negative impacts" of negative interest rates, and it mentioned a complete subversion of the standard story about competition over interest rates: countries would end up competing to have the lowest negative interest rate. Isn't that absurd? Countries like Zimbabwe, Greece, or Turkey, which are net borrowers, would see negative interest rates pull their economies out of the gutter, and yet this is considered a negative effect. It's actually weird how that article never mentioned inflation, as if it was dead for some reason.


Maybe, if you define smartness as academic smartness, which is learned in a course.

Practical smarts have probably declined as a result.


>Practical smarts have probably declined as a result.

I know the stereotype, but is there any evidence that “book smarts” and “street smarts” are mutually exclusive?

Is it possibly an artifact of an economy rewarding hyper-specialization?


It's an artifact of wishful thinking. People will go to great lengths to avoid thinking about the fact that some people are better than others; it's the biggest "inconvenient truth" of our culture.


>It's an artifact of wishful thinking. People will go to great lengths to avoid thinking about the fact that some people are better than others; it's the biggest "inconvenient truth" of our culture.

I'm not sure if the whole book smart / street smart thing is a good example of what you're saying. Wouldn't that just be an example of people becoming well adapted to their environment (or in fact manipulating their environment)?


I’m not talking about innate ability here - which is something else - just that there is finite time to learn, so 3-6 years focused on academia is less time focused on other things.


Ok, I think that aligns with my statement about specialization. With finite time and incentives to specialize, I wonder if it pressures people against being generalists. I’ve often thought common sense is borne out of having many different, diverse experiences.


All models are wrong, but that doesn't stop them from being informative. Looking at things like the Revolutions of 1848, it's easy to see how the idea applies, as so many of them were driven by University students who had no money, had no prospect of being absorbed by the existing power structures, and were not allowed any political participation. The result? Revolutionary agitation.


Honestly, there's nothing particularly novel in this thesis. It's more or less a restatement of anacyclosis or the principle behind Hesiod's Ages of Man.


The article doesn't seem great, but I might as well dump my favourite quotation about elite overproduction, from /Nationalism/ by Elie Kedourie http://www.amazon.com/Nationalism-Elie-Kedourie/dp/063118885... http://www.worldcat.org/title/nationalism/oclc/27812918 , about conditions in Germany around 1800:

"The writers who invented and elaborated the post-Kantian theory of the state belonged to a caste which was relatively low on the social scale. They were, most of them, the sons of pastors, artisans, or small farmers. They somehow managed to become university students, most often in the faculty of theology, and last out the duration of their course on minute grants, private lessons, and similar makeshifts. When they graduated they found that their knowledge opened no doors, that they were still in the same social class, looked down upon by a nobility which was stupid, unlettered, and which engrossed the public employments they felt themselves so capable of filling. These students and ex-students felt in them the power to do great things, they had culture, knowledge, ability, they yearned for the life of action, its excitements and rewards, and yet there they were, doomed to spend heartbreaking years as indigent curates waiting to be appointed pastors, or as tutors in some noble household, where they were little better than superior domestics, or as famished writers dependent on the goodwill of an editor or a publisher."


That's happened to several of the Arab oil states. Lots of educated people, but not with practical skills.

Egypt has a huge overeducation problem.[1] For several decades, Egypt had a policy that anyone who got through college could get a government job. Then the oil ran out.

[1] https://www.brandeis.edu/crown/publications/middle-east-brie...


Thanks for the link, interesting read. I had no idea Middle Eastern countries had such major issues with overeducation/underemployment. I'm used to a model where the state assigns the number of university admissions based on job-market demands, so I didn't imagine this kind of almost unlimited growth in degree holders.


At one point a lot of people only got school up to 8th grade. Then eventually everybody is going to school up to the 12th grade. Maybe in the future it will just be customary to go all the way through 4 years of college.


So when people say university degree, is it usually a Bachelor's degree then? I was assuming a Master's, so 5-7 years. That's the default in the Nordics; I don't know a single person with a Bachelor's.


In the US, at least, a Bachelor's is much more common than a Master's. It depends on the area of study, I bet. The views I grew up around treated a Master's as a waste of money, with "go PhD or get a job after the 4-year Bachelor's" as the prevailing theme.


Here a Master's is a requirement for quite a few jobs, such as teaching. A Bachelor's is enough for jobs such as preschool teacher or pharmacy clerk.

Technical sciences didn't even have a Bachelor's as an option until some time ago. I think it wasn't seen as useful, and even today a Master's is the default. Tuition is free, so there's no direct cost associated with studying longer.


Even having practical skills doesn't seem to make that much difference: witness all the engineers in Al Qaeda. Presumably there simply isn't much in the way of engineering jobs, either.


I wonder if Algeria is in the same boat; I want to say it has bigger literacy issues though.


At the risk of inviting survivorship bias, this sounds like the situation that befell the Brontë family. Well, at least one can make the argument that the Brontës died young because they couldn't move out of the unhealthy curate living environment.



Thanks: I copied and pasted from an older comment and forgot to repair the link.


So basically all the pre-barista degree people graduating from humanities departments?


When I was living in New Zealand twenty years ago, I often took the bus to get places. One day I struck up a conversation with the bus driver.

Turns out he had a doctorate and was a published researcher in his native country. However, when he moved to New Zealand, he found no work in his field. Part of this could be attributed to academia's perpetual lack of job openings, but I always wondered how much the locals' perception of this individual - as a non-white foreigner - and his attendant social status played into it.

Society seems to still have a severe problem with using talent effectively - whether that's elite experience or unskilled labor.


I don't know if New Zealand also does this, but United States licensing and accreditation boards don't accept a lot of foreign degrees and licenses. I noticed this in particular after the fall of the Soviet Union. When I first went to college, it was LA City College, and in a way - if grades had been curved - it was probably the most competitive school in the country: so many students in my biology classes were practicing physicians and researchers with medical degrees and licenses from Soviet republics, starting over because US medical boards didn't accept their credentials.


My uncle was a doctor in Bangladesh. When he immigrated to New Zealand, he worked as a bellhop in a hotel while getting his medical degree and license again. Now he’s a doctor in Australia.

Which is fine. A lot of these foreign degrees are pretty unreliable.


There's also an aspect of learning the health system of each country. It's not solely about the science aspect.


I know of someone who was a practicing cardiologist (or a cardiothoracic surgeon, can't remember which) in her home country, but after she immigrated to the US she had to take on odd jobs. Studying and taking US qualifications would've set her back by some years, and it's possible she didn't want to go through it.


Similar story: my wife's male nurse during chemotherapy was a cardiologist in Eastern Europe, and at the time had chosen to be a nurse rather than go through medical school again.


In New York State, you must be a citizen or lawful permanent resident to become a licensed pharmacist. No visa statuses permitted.

Now that I think about it, I do wonder if this violates trade agreements (e.g. the successor to NAFTA)…


It's sort of like the H1-B Visa exploitation in reverse.


That's why it's called overproduction - because there simply isn't demand for research in certain fields.

The issue is society producing an elite with a skill set that isn't needed.


I thought it was less about skillset than that implies.

If you structure society so that the kids of aristocrats are the only ones that have degrees in basketweaving, then you can say "to get this prestigious job you need a degree in basketweaving" and it all kind of works (for the aristocrats).

Non-aristocrats then do what they've been told will lead them into the elite, and at first a few might make it in. So it all seems to be working for everyone except those too "stupid" (i.e. poor) to pay their way through 20 years of education in basketweaving. Meritocracy in action.

But as it progresses, and more and more people follow the path, things go wrong. Not because there's a limit to the number of basketweaving professionals needed, but because there's a limit to the number of people who can be elite in the society, and that piece of paper used to be the limiting factor (previously acting as a proxy for "are you already connected to the elite").


And this doesn't even cover the sub rosa truth that the degree was never the actual ticket to aristocracy. The degree is a necessary, but insufficient, cause of aristocratic acceptance. There will be plenty of skilled yet mysteriously impoverished basketweavers; they aren't allowed into the Serious basketweaving studios because they, er, aren't a good cultural fit. Yet, thanks to cultural factors, basketweaving has never been more popular! Bright young people pour into degree programs and reckon they'll change it from the inside! There's even good examples of that sort of thing on TV all the time!

Am I talking about public policy? Or urban planning? Or law? Or computer science? Or--


The other way of looking at it is that the degree used to be an accurate proxy for what actually mattered, like intelligence and field interest, but now with a billion people pursuing it because the money looks good, it’s not going to be an accurate proxy anymore.


The degree used to be a proxy for being upper class. But when it became available to the lower class due to loans and public schools, the degree lost its utility as a signal.


I like the way your mind works.


Suppose that the skill of the elites you produce follows a normal distribution. There are only jobs for the best 0.001% of people in the field, but to get enough candidates who clear that 0.001% bar, you have to produce the other 99.999%.

You're sifting through sand to try to get the flakes of gold. There might not be a need for the grains of sand that are produced by the system, but without filling the system with them, you wouldn't have been able to extract the gold that's mixed in.

This is why elite overproduction occurs. Many of them may be perfectly competent individuals, but as long as employers are competing to hire only the best of the best, the system will have to massively overproduce to satisfy that demand.
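
A toy simulation makes the ratio concrete (the bar and the population size are hypothetical, chosen only to illustrate the tail math):

    import random

    # "Skill" as a single standard-normal draw; the hiring bar sits far
    # out in the tail. z > 4 is roughly the top 0.003% of the distribution.
    # All numbers here are illustrative, not calibrated to any real market.
    random.seed(0)
    BAR = 4.0
    candidates = [random.gauss(0, 1) for _ in range(1_000_000)]
    hired = [c for c in candidates if c > BAR]

    print(len(hired))                             # a few dozen "gold" hires
    print(len(candidates) // max(len(hired), 1))  # grains of "sand" per hire

Out of a million candidates you hire a few dozen; everyone else is the overproduction.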


Except that's not what is going on. Usually, when the time comes to look for a new postdoc or a new professor, there are plenty of candidates of very high talent and future potential, and committees end up deciding based on low-information signals such as publication count, because otherwise they'd have no clue whom to prefer. Of course you want there to be more PhDs than professors, to allow for an additional selection of the very top talent at that point, but the current situation is that we're producing too much sand as well as too much gold.


The place where the sand/gold analogy breaks down is that the goalposts are constantly shifting due to competition.

If I've hired the best person in the world, and they're the best out of 100,000 candidates, then if someone else also wants to find the best person in the world, they'll need to be better than the current best person in the world. It may take 200,000 candidates to find that person. To keep finding someone better than the current best person in the world, you constantly need to increase the amount of sand.

To find the new best person in the world, you'll produce geniuses who are as good as the old best person in the world, but they're no longer good enough to be the NEW best person in the world.

So yes, we're overproducing on gold, but now we're looking for something even better than gold.
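
You can sanity-check this dynamic with a toy i.i.d.-skill model (all numbers hypothetical): for continuous i.i.d. draws, the chance that a fresh batch of m candidates contains someone better than the best of the n already screened is exactly m/(n+m), so even odds of a new record means doubling the pool.

    import random

    def new_best_odds(n, m, trials=2_000):
        """Estimate the chance that the best of m new candidates beats
        the best of n already seen, with i.i.d. normal "skill" draws.
        For continuous i.i.d. draws the exact answer is m / (n + m)."""
        wins = 0
        for _ in range(trials):
            best_so_far = max(random.gauss(0, 1) for _ in range(n))
            if max(random.gauss(0, 1) for _ in range(m)) > best_so_far:
                wins += 1
        return wins / trials

    print(new_best_odds(1_000, 1_000))  # ~0.50: you need as many again for even odds
    print(new_best_odds(1_000, 100))    # ~0.09: a small batch rarely beats the record

So beating the current record-holder is not a fixed cost: the better the last hire, the more sand the next search has to sift.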


This presumes (a) a highly reliable ability to identify the "best person", which we simply don't have, (b) that "best" is unidimensional and unrelated to the context one finds oneself in, and (c) that such a "best" person will also be the one who will do the most to advance science. Just take a look at a century of Nobel prize winners and you'll see that none of those assumptions hold.


The assumptions don't hold in reality, but the myth of them holds in the minds of people doing the hiring, and that is enough.


They would cost too much. You take the second best and claim they are the best person on the market.


It doesn't cost too much, because the employer doesn't pay for it; the candidates do.


A clear definition of "elite" is needed to make any sense of this discussion.

"People with Diplomas": We have many expensive diploma mills and some effective schools.

"Educated people who work in their field of study": This is an unnecessary constraint on talent and interest. Also, education does not equate to talent or interest.

"Rich People": We appear to have many, both conspicuous and inconspicuous. You're in the planetary top decile if you have healthcare, shelter, clean air, food, and water, and safety.

"Leaders": We have a system to generate simulacra that protect the corporate assets.

"Brilliant minds": Hard to produce by traditional means. We print diplomas instead.


"because there simply isn't demand for research in certain fields"

Our society has enormous demand for clean energy, but it has completely failed to fund fusion research. We have spent more on lipstick and casinos.

So physics researchers can't get decent pay while the world is going to hell in a handbasket.


The rhetorical trick here is the assumed definitions of "society" and "demand".

Is there enormous "demand" for eg useless plastic garbage or non-nutritive fast food, or is that the choice-of-least-resistance offered to and taken by millions who lack the resources to consistently "choose" more "wisely", and lack the voice or knowledge to demand better choices? Or, why is food optimized for looks and uniformity rather than nutritive value?

Does "society" choose how to allocate resources, or are there a handful of insanely powerful people whose hands rest lightly on an incredible variety of policy levers that they'll never move for personal financial reasons? Or, why is Australia so intent on coal production?


ITER alone has absorbed about as much investment as the entire Manhattan Project, in inflation-adjusted terms. That’s just one project to produce a prototype reactor. Fusion has been very heavily funded, to lots of hype but very little evident progress. Sometimes throwing money at a problem just isn’t the answer.


Every year ~$731B is spent on the US military, $380B on makeup, and $10.7B on the NSA, versus a measly $4B on ITER by the whole world combined, of which the US contribution is ~10%.

In what world is this pathetic level of funding adequate, let alone heavy? Do you realise the cost of this reactor is roughly in the same ballpark as 'normal' nuclear reactors?

You claim there has been little progress; do you know that fusion has advanced faster than Moore's law?


I don’t know where you got $4bn for ITER from. The project itself estimates a cost of €18-22bn by the time it’s completed, with some estimates well over double that in real terms. It’s also one of about 100 fusion reactors built since the 1950s.

Comparing the budget for one area of research to entire defence budgets is frankly just giving up on reality.


You are comparing against the entire US military budget (which includes funding for its own fusion research), an entire category of consumer spending (i.e., money that the government can't spend to begin with), and an intelligence agency that employs tens of thousands of people. How are any of those valid comparisons against a single (albeit important) research project that may never bear fruit?


All these efforts are less important than the global survival of organised civilisation, to which fusion research is key.

Also what kind of argument is "an intelligence agency that employs tens of thousands of people"?

Does headcount make it okay to spend money on the NSA and not ITER? Has the NSA borne more fruit than ITER in some tangible way, besides violating our privacy and freedom?

You seem to be encouraging the largest possible and least efficient bureaucracy.


> Also what kind of argument is "an intelligence agency that employs tens of thousands of people"?

My point is that, however you may feel about the NSA, you are comparing the budget of an entire government agency to that of a single research project. ITER is (AFAIK) the largest fusion research project, but it isn't the entire field. Furthermore, fusion is an important piece of the sustainable-energy puzzle, but it seems an exaggeration to call it the singular key. I just don't see how this comparison makes sense.

Why not compare ITER against other research projects? I suspect it is because you know that $4 billion is already a huge budget for a single project - maybe not the largest ever, but among them.

> You seem to be encouraging the largest possible and least efficient bureaucracy.

The only thing I am encouraging is a reasonable apples-to-apples comparison.


Your yardstick is totally wrong - we know how much funding is needed, and we underinvest:

https://amp.reddit.com/r/Futurology/comments/5gi9yh/fusion_i...

The laws of physics have no obligation to be 'researchable' at whatever level of funding you deem sufficient.

This should be priority number 1. We should be funding fusion on the level of the military and healthcare, not on the level of some second-rate government agency; our future depends on it.


Most people in fusion research consider ITER a huge waste of money that is slowing down progress toward a successful fusion power plant. The US fusion research budget (and other countries'?) now mostly goes to ITER, with money spent mostly on pouring concrete and building magnets, not on discovering the new science needed to make an economically viable reactor.

Maybe it is worth the money to have a large international project involving most of the large world powers (China, the European Union, India, Japan, South Korea, the United States), but like the International Space Station, there is not going to be a lot of cutting-edge science going on there. Especially since it is taking so long to build.

On a brighter note, maybe in part because ITER is such a boondoggle, there are now many small (and not so small) VC-backed fusion projects being funded that are testing out a wide variety of fusion concepts. High-temperature superconductors help, and our understanding of how plasmas work and behave is still improving at a great clip. A carbon tax would really improve the economic case for developing fusion power (and fission power, while we are at it).


Exactly: simply spending money doesn’t solve problems. The issues with fusion are not with funding but with fundamental research, and you can’t always spend your way to solutions in research either. The laws of physics can’t be bribed or influenced by financial incentives. I completely agree we need to continue fusion research, proportionately to the evidence for the specific lines of research. The fact is, though, we have been spending huge amounts on fusion for many decades. As ITER shows, simply increasing spending will almost certainly just mean wasting all or at least most of the extra money on dead-end lines of investigation.


My father is a longtime plasma physicist researching fusion power, and I have talked to him quite a bit about these issues. The research budget for fusion has, unfortunately, long been spent on a few large projects instead of many smaller ones that would likely have let progress toward controlled fusion power plants happen much more quickly. We probably could have had them in the 1990's or 2000's with good planning and $3-5 billion a year (2020 dollars) in funding from the 1970's on, instead of the ~$500 million that was being spent.

Very significant progress has been made, though, and our understanding of how plasmas work continues to improve. We probably understand the physics of how to build an economical fusion power plant (if there were a reasonable carbon tax), but there are a lot of complicated engineering problems that need to be figured out - materials that can self-heal under high neutron bombardment being a major one. A Manhattan-like project with the right focus, leader(s), and the ability to ignore bureaucratic red tape (radiation rules make developing anything that produces radioactivity almost impossible) might be able to do it in a similar time frame, but I don't know if the US's current culture would support it.


Massive funding for a few big projects to do a "Manhattan Project for Fusion", which I have often seen advocated here, has been the actual strategy for a long time and it's failed. As you said, most of the funding has been given to a few big projects.

With TMP we actually had several direct engineering routes to viable devices using known technology at a pretty early stage. With fusion, frankly, we did not. In that situation you're much better off funding a wide variety of different initiatives and ideas, and ramping up funding as the results come in.

Even a much smaller budget, spent on a wide variety of basic research and evaluation projects, would have been far more cost-effective and almost certainly a much more fruitful approach. But maybe I'm not reading what you mean by "Manhattan Project" right.


The original "Manhattan Project" was to produce a nuclear bomb once the physics of sustained nuclear reactions were well enough understood to know it was possible. This level of knowledge was probably known about fusion power generation in the 1960's. As you said, more funding is not always better. It is the whole structure of the project that is important.

The Manhattan Project was organized by a military general with direct access to the president and the willingness to use it to get things done and a well known scientist overseeing the scientific side of things. They ran many different paths to possible success at the same time (U235 and Plutonium bombs, at the top level) and had about 130,000 people working on it with the urgency of war and no worries about patents or trade secrets. There were lots of bottlenecks and new technologies to invent in the process, but these problems were systematically addressed and overcome.

I don't really think any "Manhattan Project" or Apollo like program run by the government could be done today, but I don't think the US has completely lost the ability to do such things. SpaceX is making amazing progress on rockets at the moment and if they get people to Mars in the next 4-6 years, I would say they have accomplished a similar feat.


The Manhattan Project was also only the start of a long series of nuclear weapons experiments and research; in fact, it's still continuing. The CTBT doesn't ban computer simulations, so many research institutions employ supercomputers.


That's a problem with how we think about demand, isn't it?

If society needs some serious phlogiston research because of existential threats, but refuses to fund phlogiston research, can you say phlogiston research is in high demand?


If you outlawed lipstick and makeup tomorrow, not a single physicist would get a bigger salary.


I don't think this is accurate. We have a need for clean energy due to global warming, but we don't have as much demand for it. We spend more on "lipstick and casinos" because there is true demand for them, as can be seen from the money spent. Most people could probably get solar panels if they stopped getting fast food and coffee out for a couple of years, but they don't want to: they have a higher demand for the fast food and coffee than for clean energy. Similarly, schools spend so much on their football programs, as opposed to their teachers, because the demand for football is there and drives income, so it gets the funding.


"but we don't have as much of a demand for it."

You don't actually know that. I can buy lipstic, and you can measure that in your statistics in X billions.

But I have also been "demanding" fusion for 15 years. I am prepared to soend real money on it, buy I can't buy a fusion, and you have no way of measuring my demand for fusion against my demand for lipstick. There is no 'fusion party' I could vote for either.

Also, the best way to suport green energy is to get a green energy provider rather than solar panels - its more efficient both in terms of physics and allocation of capital.


You personally have a demand for fusion but the market makers don't see enough of a demand or reward vs risk to invest heavily in it. When they see enough of a demand / reward they will invest.


"the market makers don't see enough of a demand"

Again, how do they know how much demand there is? What are they measuring? They don't, and they don't care.

Market makers did not invest in GPS, fission, touchscreens or hundreds of other technologies - governments did. 'Market makers' commercialised them.


This is called a market failure: a lack of communication and coordination prevents optimal behavior from happening.


There are things people need but don't want, things people want but don't need, and things people think they want but neither actually need nor want. Demand is not a measure of what is useful, or rationally desirable.


That's absolutely not the issue. The issue is an economic system that prioritises what can be readily produced and consumed, rather than maximising the degrees of freedom that can be created - whether in terms of general-purpose computing devices, or education that maximises individuals' capabilities, skills, and life satisfaction.


The non-white foreigner and attendant social status thing is hopelessly confounded with the "developing countries have extremely corrupt academia" thing. Outside of certain edge cases, it doesn't matter in the West that you have a Ph.D. and some papers if you got them in e.g. China or India. Read a little about those systems: exam results are routinely bought and sold, papers fabricated, whole journals completely made-up.

Edit:

And I should be crystal clear because finding opportunities to perform being offended is extremely popular these days: many of these people are actually brilliant and capable, the problem is the credentials do not provide a signal.


I'm aware of that. My dad's a university professor.

In this case, though, the gentleman was from Japan, which is hardly a "developing" country.


Ah, that changes things. Japanese credentials can be a little inscrutable sometimes, at least in my field. Their connections with the outside world's academics aren't as strong for us. I know that's not true in e.g. math. Wonder what's going on.


I suspect it's largely a cultural issue. Japan isn't as outwards-facing as most countries, certainly most large countries. It's one thing to export pop culture, which it does very successfully, quite another to export academic products from such a different system. A system so different it probably doesn't descend from the original Academia at all, unlike most modern instantiations of scholarship.


What? The pursuit of knowledge does not require any hereditary intellectual descent from ancient Greece, last time I checked. And the capacities of an academic culture are not correlated with closeness to the ideals of the ancient Greeks either.

They rise or fall on their own merits.


Modern academia doesn't even descend from any ancient Greek institution, since schooling in Greco-Roman antiquity was far more of a personal engagement; universities were extensions of Christian monasteries, and modern formal logic and 'rational debate' developed out of the practices of those monks' communications with one another across distances.

Moreover, it was the Muslim Caliphates and the Eastern Romans who inherited and further developed the wealth of surviving antique knowledge between the end of the Roman Empire in Western Europe and Northwest Africa and the Renaissance; but ever since Hegel, Europe has had this lasting fascination with a direct link to antiquity.

Edit: An aside - people of Descartes' and Locke's generation were amongst the first waves of a secular break from this sacred academic tradition, a break commensurate with similar breaks in politics.

It is this sacred origin of secular thought that John Locke is referring to in An Essay Concerning Human Understanding in the form of "the schoolmen", whom he references as his constant opponent: these are the Christian scholastics, from Peter Abelard down to Aquinas, who developed the prototypical notions of universality and causation as we understand them today. The two were conflated from the scholastics all the way down to Kant, rather than the former being treated, as we now treat it, distinctly as an exhaustive, inclusive set of things in some category, e.g. being human. The categorical imperative doesn't mean "act as if everybody else had to also do as you do"; it means "act as if your decision were inextricable from any other necessary, causal law we observe in nature", like gravity (does the Earth decide to pull you back down when you jump up?). Morality based on pain/pleasure is a BIG NO-NO for Kant, and Kant's minimalist definition of being human excludes being able to be recognized empathically/sympathetically.


I'm not talking about capabilities at all. I'm talking about difficulty exchanging ideas when cultures are very different.


If the ideas are correct or built on a widely accepted basis, and can be reliably shown to be correct by third party investigators, i.e. in the hard sciences and engineering, the extra friction in exchange would hardly matter.

If you mean when the ideas are contentious, then yes, the language barrier and the different culture would add noticeable friction and probably put Japanese academics at a disadvantage in debate. Though that degree of friction seems to be independent of closeness to the ideals of ancient Greece, unless you know of some correlation?


Well, anecdotally... As a member of a youth program I spent quite a bit of time around elders of the native population of Canada. These cultures have a pretty sophisticated system of refining and passing on knowledge, which I found fascinating to participate in. I found that fellow program members who were very immersed in the western system of scholarship had a very hard time benefiting from the Native system of scholarship. Assumptions that come out of the Western system of scholarship, such as around the value of debate or abstraction, just don't translate well. It required a different attitude to learning, one I was probably more prepared for than my peers in part due to my time studying Japanese martial arts. There are cultural assumptions around how knowledge is discovered and perpetuated that are so deeply ingrained it's hard to imagine they'd be different in a different culture.

Now, the Canadian native population has lost a huge portion of its cultural base, they essentially have to adapt to the Western model. But Japan's system has experienced no such widespread damage. I don't know much about it, but I could see there being the same kind of deep differences with the western system that make moving products of thought between them difficult. Not just language barriers that make debate difficult, but assumptions such as the proper role of debate. The Academia is a common ancestor to most existing systems of scholarship today. Japan's system doesn't share it, so the water Japanese scholars swim in could be very different from the water the rest of us are used to.


Very true, but this still doesn't diminish the fact that India and China produce megatons of engineers and scientists.

India and China also provide many more opportunities for "natural selection" to weed out bad workers.

The West, on the other hand, has degree mills that will hand you a degree, and workplaces that will not test your abilities even minimally; the surprise comes at the person's first serious job.


Overall a good point I think. I would say that selection in the West happens at the university admission level, not really in university (they do everything they can to get you to graduate once you're admitted). But admissions are still, for now, a fairly good and informative filter.

Some workplaces, canonically software engineering, do test your abilities (I assume you mean in interviews?). There are plenty of logic games and practical tests. I agree most jobs don't do anything like this, and it's a shame.


Or, rather than a societal problem, it's a case of someone with specialized skills moving to an island nation which had no opportunities for people with such skills (poor planning).


I've seen plenty of engineers with 6 years education working on assembly lines in China.

An incredible boon for whoever was employing them. Having a person who can "debug" equipment problems on the line within minutes, or optimise seemingly innocuous but critically important small details in the manufacturing process, is priceless; this is what makes the whole difference between a failing manufacturing line and one that runs well.


Well, what field and what journal? If he published in an art history journal, it kind of makes sense. If he published in an elite engineering or mathematics journal, it's more surprising that he couldn't shift within STEM. But on its face, simply having a doctorate and a publication doesn't say much; you need to be from a useful field. Also, NZ is an island - wouldn't it be wiser to line up a job before moving there?


I’ve never been to New Zealand, so I can’t comment. But Silicon Valley is full of highly educated, non-white people working in high-paying jobs. America may have an overly restrictive immigration system, but at least we know how to put our immigrants into good jobs.


> but I always wondered how much the locals' perception of this individual - as a non-white foreigner - and his attendant social status played into it.

Honestly, a lot of foreign universities are... questionable to say the least.


This reminds me of this related Soviet joke: https://news.ycombinator.com/item?id=25857120


> doctorate and was a published researcher in his native country.

It reveals the actual role of these positions in society: they are not based on demand for a service; rather, they are privileged positions awarded through a hierarchical selection process. In return, we get university teachers, which is the de facto service to society.


This is actually a problem in Canada.

Lots of people complain about the US’s visa sponsorship process. Basically a company has to offer a job.

Canada will give you permanent residency based on points. Whether or not a job exists.

Not saying one is better than the other - they both have pluses and minuses.


It's worse than that. Points are awarded for degrees. So foreigners just purchase fake degrees they can then use to immigrate. [0]

Of course, no licensing board/employer would ever hire them.

[0] https://www.youtube.com/watch?v=IHTg5zzFEKE


> he had a doctorate and was a published researcher

> as a non-white foreigner

What was his field of study, and where was he from? Some fields are more niche than others, and I think biases are (understandably) more specific than purely skin-tone-based.


Please give details: which country, and what doctorate? Lots of doctorates are useless. No surprise here.


Or maybe his doctorate is from a country where someone can buy one pretty easily.


> Society seems to still have a severe problem with using talent effectively - whether that's elite experience or unskilled labor.

No, I do not think so, because it seems to me a lot of this talent is either 'self-described talent' or 'grade-inflated talent'.


I've heard similar stories from my own country (Norway). I get the feeling that we only accept educations (as valid or equivalent to our own) if they are from a handful of Western countries. People with an education from a poor country probably have to take the entire degree again if they come here to work.


In addition to racial discrimination, there is privilege discrimination (discrimination on the basis of luck). This can affect you regardless of race, but of course it's likely worse when combined with racial factors. Privilege discrimination can be quite overwhelming, because the media doesn't talk about it and its perpetrators don't hold back at all.

Some ways privileged people discriminate against unprivileged/unlucky people (e.g. when competing for a promotion or investment):

- "They are too negative, we need someone charismatic." (In fact, PersonX's exposure to wave after wave of bad luck has made them a critical thinker, a realist, which makes them perfect for the job).

- "I don't like PersonX. They can't build rapport with stakeholders." (Because PersonX's has had real problems in their lives, they cannot relate to the artificial problems which members of the elite class constantly create for themselves).

- "They lack confidence." (In reality, what they lack is overconfidence; they have the ideal level of confidence/doubt to navigate a cut-throat marketplace; overconfidence breeds neglect which can destroy the company).

- "They don't have a track record of success" (Because they were always overlooked and never given a first chance to prove themselves; as a result, they've spent decades optimizing themselves and fixing every tiny flaw and gap in their knowledge and personal character; they are highly optimized and perfect for the job).

There is an inversion of cause and effect in the minds of privileged people which makes them think that they need confident people. In fact, confidence is more a consequence of luck than of competence. Nobody likes overconfident people. When you have power, you don't need any confidence; the less, the better. Power earns respect and, on its own, is the best substitute for confidence.


This feedback loop certainly does happen to some extent, but my intuition is that it's a small extent. At least in elite academia in the U.S. you get tons and tons of bonus points in every possible situation for being from an "under-represented" group (for some broken definition of "under-represented").

And yes, I benefit from this sometimes.


I don't think it's a small extent at all. The idea that confidence/charisma is a good thing is at the heart of global entrepreneurial culture. It has become so deeply embedded in people's mindsets that they don't even realize it's a bias. They think leaders need charisma in the same way that you need oxygen to breathe.

It's quite crazy, though, because so many charismatic leaders have turned out to be frauds: e.g. Elizabeth Holmes of Theranos, Adam Neumann of WeWork, Trevor Milton of Nikola, and countless smaller examples which most of us have experienced personally in our own careers. On the other hand, many of us have seen or experienced the effectiveness of more modest critical thinkers in leadership roles. Jeff Bezos himself admitted that "[people who were right a lot of the time] were people who often changed their minds."

Confident people don't change their minds because, in most settings, it projects a lack of confidence.


I'm not sure why you would think about entrepreneurship in particular, it's a small part of "elites".

I think the value of the personality traits you describe changes a lot with culture. For instance, I think the kind of confidence/charisma you're describing would go very poorly in Japan.

My intuition is that what you describe does happen, but it's a very small part of what's going on. I think that people who fall into this feedback loop are already psychologically predisposed, whereas there are plenty of people in the world who display this sort of confidence and charisma and came from nothing. My own internal model is 1st gen immigrants who run successful multi-store businesses.


I get what you're saying. It's true that some very lucky people manage to escape their unprivileged position early enough in their careers such that it is not a problem for them. Their career growth trajectory has given them a constant source of optimism which has given them confidence in their abilities and their future. That's because they believe there is a strong correlation between talent, work and success - That's the source of their confidence.

I think if you look at unprivileged, highly skilled, talented people who've been struggling to break through glass ceilings for a decade or two (in spite of delivering high quality work), then, you will get a better picture of what it means to be 'unprivileged'. These people are not confident in the future because they don't believe that there is a correlation between talent, work and success. Yet they are the most highly capable and they would have the best chance of succeeding if given the chance. But they probably won't be so lucky. They will be the most discriminated against of all.

The consequences of being discriminated against in the past will be used as a basis to discriminate against them in the future: "We should not give them the opportunity because they have no track record of financial success." (Never mind that they do have a track record of excellence in their fields; the financial proceeds of their work were captured by others higher up in the hierarchy.)


Thomas Sowell talks about this phenomenon a lot; however, he describes it as an abundance of wealthy individuals who are highly educated in philosophical ideas but who have no practical or useful skills for society.

It often happens in societies where hard work is looked down upon as something only poor people do. The result is that these 'elites' have nothing else to do but agitate politically.

I feel like people have a great need to succeed and further themselves in life. When they don't have an avenue to satisfy these needs through hard and useful work, they find other outlets - outlets that can be very detrimental to society as a whole.


> Thomas Sowell talks about this phenomenon a lot; however, he describes it as an abundance of wealthy individuals who are highly educated in philosophical ideas but who have no practical or useful skills for society.

I wish this was the only problem. I have been exposed to so many 'elites' (investors, diplomats, C-level executives) that are just plain stupid.

For some reason people assume they are very smart, perpetuating this myth.


> educated in philosophical ideas but who have no practical or useful skills for society

Like Lisp programmers.


In this survey, Clojure was associated with the highest salaries globally, and second place in the United States:

https://insights.stackoverflow.com/survey/2019?utm_source=so...


In my last job I moved from a team that had no "senior" level people on it to a team that was pretty much all "senior" and up level people. I was impressed at the high rankings of everyone on my new team and thought that first, I was about to witness powerhouse performances and a lot getting done, and second that on such an elite and senior team I would have a good chance of moving up the org chart myself.

What I found was that all the senior plus level people wanted to do was work on plans and reviews and architecture decisions and so on and so forth. We had a lot of "leadership" but very few people willing to actually put shovel to dirt so to speak. Our ultra-elite team was really unproductive.

I began to think of the senior people as something like salt. A meal may be best with a lot of food and a little salt. One or two high level people to plan for and guide a lot of lower ranking people who could actually do work. The second best configuration would be all food and no salt - people doing work without the high level insights and planning. The worst configuration, I discovered, was all salt and no food - just a ton of meetings, reviews, plans, and missed deadlines.

I'm connecting my experience with this blog post because it seems to me like the process of "becoming elite" causes the damage. Everyone on our software team was so busy trying to have cross-team impact, show their leadership, come up with consequential architecture decisions, and the like (because all that is what you need to be well regarded as a senior-type person) that nobody wanted to do the more mundane tasks of actually implementing our plans. It's like the examples in the article: everyone is so worried about the metrics they can control, like academics publishing more and more papers in better journals, that they forget about actually discovering or documenting things.

To use the high variance language, maybe in the ur-company, when it was just starting, you could have done something big and consequential. If it was a success you'd get rewarded and become a senior or a principal or whatever. If it failed, even if it failed through no particular fault of your own, you wouldn't, or maybe you'd get fired. Nowadays, as the company has grown, there are fewer of those big risks and more people who want to be seniors and principals, so people take the low variance route - establishing that they are influential planners who wrote and reviewed a lot of stuff and so on, forgetting about the nominal goal of the company - shipping good products.


As an ex-elite person, I think what you are missing is their performance goals, as given to them by their manager.

As a senior person in a big org, you are judged on your leadership efforts and cross-team efforts, so that is what they tried to do.

I bet that if their goals were to write code, you would see them write code.


Exactly. Many senior-level people probably do want to write code, create things, and do that "shovel to dirt" work, but their company's performance review process discourages it. So, instead they have to fill their days with all that cross team impact, leadership, architecture, complexity, and process improvement stuff. Everywhere I've worked, "shovel to dirt" work is seen as what's done by the lowest rung on the totem pole. This mentality is all over industry. If you want to get promoted, you need to "create organizational transformation and synergistic thought leadership" instead.


A possible way out is to tie more of seniors' compensation to equity, including senior managers'.

You don't automatically get more pay the more senior you become - you get more pay by increasing sales or products or customer retention etc., which increases equity value.

If everybody just plans and reviews, equity value will not grow.


If you're working for a FAANG or any company of similar scale, it's virtually impossible for an individual contributor to move the needle in terms of stock price. So the rational thing to do is plan and review with an eye to your own promotion, and let the busy bees in Ads lift up the stock price by churning out tens of billions in revenue.


Let’s say there are two ranks. Senior 1 and Senior 2, with Senior 2 being a higher ranking position.

Senior 2 will naturally come with more equity than Senior 1. The fastest way to increase your pay will be to be promoted from Senior 1 to Senior 2. Your incentive will be to do whatever it takes to move from Senior 1 to Senior 2, which more often than not isn't putting shovel to dirt.

Go to Levels.fyi, and look at how equity increases as you raise the ranks.


It seems there was no manager, and the constant "architectural" discussions happened because everyone just wanted to show off how smart they were.

There are companies where no one from the business side knows what is going on in development, and they are not interested. They just pour money into dev, and all they know is that "development is expensive and takes a lot of time."

I bet the goal was something like "just ship the system," which probably never happened or took a very long time. Maybe the devs were even making up their own requirements according to what they thought was needed.

So I think you are being too charitable in assuming there were any goals or managers at all.


One counter example: the Go team, when it started (Rob Pike, Ken Thompson, Robert Griesemer, Russ Cox, Ian Lance Taylor), was quite senior. But they did deliver.


I'm pretty sure they went ahead and wrote the thing rather than generating a million white papers about it, though.

I suppose the 'thing' being prestigious to write is a differentiator, here. Put that team on AdWords with principal expectations for each of them and it's probably dysfunctional.


In my experience there are some genuinely elite people in the world who do genuinely elite things. However, most of the people with elite titles or rankings aren't such people.


The Golang team refused to add `math.min(x,y int32)` [0] when I asked them via an internal Google ticket around 2017. The company already had 63 separate instances of the function; I would add the 64th in my team's code. They said that was fine and closed the ticket as "Won't Fix - Working as Intended".
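
For context, each of those copies is roughly this (my reconstruction, not the actual internal code; this was pre-generics Go, so every numeric type needed its own copy):

    // One of the 63 (now 64) near-identical copies floating around.
    func min(x, y int32) int32 {
        if x < y {
            return x
        }
        return y
    }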

I wrote a golang WaitableBool struct for my team's code. I tried to submit it to the company's internal 'contrib' library, but golang-team rejected it. I pointed out that the company's C++ and Java versions of the struct were used in >150,000 places. Clearly there was demand and the Golang version would become widely-used. They argued that all Golang coders should always remember details of channel syntax and be happy to read five lines of boilerplate instead of "value.waitUntilTrue()".
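
For the curious, here is a minimal sketch of the idea (the internal struct isn't public, so the details here are my guesses, not the actual code):

    package waitable

    import "sync"

    // WaitableBool is a one-way flag: it starts false and can be set
    // to true exactly once, waking every goroutine waiting on it.
    // Sketch only - the internal version may differ.
    type WaitableBool struct {
        once sync.Once
        ch   chan struct{} // closed when the value becomes true
    }

    func New() *WaitableBool {
        return &WaitableBool{ch: make(chan struct{})}
    }

    // SetTrue flips the flag; repeat calls are harmless no-ops.
    func (b *WaitableBool) SetTrue() {
        b.once.Do(func() { close(b.ch) })
    }

    // WaitUntilTrue blocks until SetTrue has been called. A receive
    // from a closed channel returns immediately, so late waiters
    // never block.
    func (b *WaitableBool) WaitUntilTrue() {
        <-b.ch
    }

The point is that callers write value.WaitUntilTrue() and never touch channel syntax; the boilerplate lives in one reviewed place instead of being re-derived at every call site.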

I think their team was too senior. None of them were willing to add breadth to the standard library, despite massive code duplication within the company. They need a junior team member who can get promoted for growing and maintaining the standard library. We all need that.

[0] https://pkg.go.dev/math


An example from this year: Rather than fix problems with the golang.org website search box [0,1,2], they just removed it [3]. They ignored the low-effort workaround of making the search box into a form that submits to Google Search. Is their whole team too senior to spend time maintaining a website search function?

[0] "Make website search case-insensitive" https://github.com/golang/go/issues/40217

[1] "Make search box remember what you typed" https://github.com/golang/go/issues/40218

[2] "show search box on smaller viewports" https://github.com/golang/go/issues/40220

[3] https://github.com/golang/go/issues/44357


On the other hand, I once had the opportunity to work on a small team like that, where everyone else was senior with 10+ years of experience. It was probably the best experience of my career: incredibly humbling, but I learned so much, and we did in 6 months what I would have thought only possible in one or two years.

My guess is that it can go either way and a significant part of it is having the right environment, the right management, probably some luck too.


Heh, at a previous company we had the opposite problem: relatively few senior people, and they almost all still wanted to get stuck into the nitty-gritty engineering instead of doing the work needed on higher-level coordination.


Reality sucks: problems, emergencies, deadlines and complaints. Planning and designing is much more fun, and it stays 9-5.


Heh, my current team is trending towards all-senior swe, and we're all eager to eliminate as many meetings as possible to keep the focus on building shit and solving problems. Love it.


> the process of "becoming elite" causes the damage. Everyone on our software team was so busy trying to have cross team impact, show their leadership, come up with consequential architecture decisions, and the like - because all that stuff is what you need to be well regarded as a senior type person

Isn’t this the exact reason why Google has such a high rate of churn in its product lineup?


> For example, early in ancient empires, many rose in status via winning military battles, or perhaps by building new trading regimes. But later in such empires, status was counted more in terms of your connections to other statusful people. Which led to neglect of military success, and thus empire collapse.

I've always wondered if this was true, or if it's an artifact of the survival of records/histories that over-promote governing coalition members instead of non-governing but wealthy and influential citizens.

I.e., most ruling autocratic dynasties are founded through violence, so it makes sense that official records reference individuals who took part in this. Similarly, most autocratic dynasties are sustained through peaceful transitions of power to the next generation of governing members through the acquiescence of coalition members, so it makes sense that - over time - official records will reference individuals who succeeded through their connections instead of their military merits.

If you assume that autocratic governments are unstable over long periods of time, and if you intentionally ignore the large number of ruling coalitions that came to power through war but failed to survive a peaceful transition of power, the author's perspective here makes sense. However, this doesn't mean that it's necessarily accurate.


The quoted passage seems strange to me, and is probably incorrect under the normal interpretation. If you were in a position to win a battle, say as a general or the leader of a large force of fighting men, you were probably already an elite of some sort. The only examples I can think of where this isn't true are either in modern times (US Grant being a possible example) or things like slave rebellions. But those are still the exception rather than the rule.


While it's pedantic, in sociology and related fields, status is almost always conceptualized in terms of peer-group connections anyway.

So it would be incorrect to say that General John Smith had high status simply because he was a General. Instead, John Smith's 'status' depended on whether or not he was a peer of other high-status individuals.

In other words, it was always a matter of 'who you know'. It's just that, as you pointed out, in the early stages of a new ruling regime or dynasty, being a General or holding a similar military position pretty much guaranteed that you had the 'right social connections'. This is true even if you gained those connections as a byproduct of your military rank.

This is likely true even in many or most modern states.


Surely there is some sort of acknowledgement of an elite class in societies where one group holds disproportionate power. For example, the Spartans over the Helots, the planter class of the Antebellum American South, the caste system of India, etc.


> The only examples I can think of where this isn't true is either in modern times (US Grant being a possible example)

War tends to effectively flatten societal hierarchy. When a society is in trouble the best people for the job tend to rise to the top because results are what matter and when the alternative is bad enough people don't care that someone doesn't have the right pedigree if they have the necessary skills.


Historically, not really. It was the norm for people leading the fighting to be some sort of elite. Think the nobility in middle ages Europe, the Spartans' control over the way more numerous Helots, the caste system of India, and the Samurai of Japan. There are probably exceptions I'm not thinking of, but for the vast majority of history, it was the already powerful that were leading the forces, not some plucky newcomer with the right stuff.


Nobody's realistically expecting anyone to go from peasant to general. That's an obtuse red herring.

What does happen is that everyone worth a crap gets their career fast-tracked. People go from machinist to plant operator or from junior officer to general in a year, etc. Meanwhile, people who would have been promoted because they know someone get passed over, because when things are serious, as in wartime, people feel they cannot get away with favoring the well-connected over more promising candidates.

People who would have had to work a whole career to break between class levels can do it in a fraction of the time.


Look, the article is making a historical argument, and you're talking about the modern ways wars are fought. I believe the author has an incorrect belief about how battles were historically won, and I pointed to several historical examples. Now, you can either refute my examples or provide your own, but pointing to the way the military runs today does nothing to counter my argument that historically "winning a battle" was not a great method of social mobility to "elite" status.


Wars bring huge amounts of corruption with them, and they involve huge amounts of bad decision-making too. That does not sound like a historical argument to me.


Brave New World hints at this phenomenon.

Mustapha Mond smiled. "Well, you can call it an experiment in rebottling if you like. It began in A.F. 473. The Controllers had the island of Cyprus cleared of all its existing inhabitants and re-colonized with a specially prepared batch of twenty-two thousand Alphas. All agricultural and industrial equipment was handed over to them and they were left to manage their own affairs. The result exactly fulfilled all the theoretical predictions. The land wasn't properly worked; there were strikes in all the factories; the laws were set at naught, orders disobeyed; all the people detailed for a spell of low-grade work were perpetually intriguing for high-grade jobs, and all the people with high-grade jobs were counter-intriguing at all costs to stay where they were. Within six years they were having a first-class civil war. When nineteen out of the twenty-two thousand had been killed, the survivors unanimously petitioned the World Controllers to resume the government of the island. Which they did. And that was the end of the only society of Alphas that the world has ever seen."


Elite overproduction is not as big a concern as the hype would suggest.

Despite record-high college attendance, the college wage premium is fatter than ever, and unemployment for grads is really low at just 2 percent, the barista meme notwithstanding.

The majority of grads major in actionable fields with good career prospects, not journalism or the humanities. The liberal elites who write highly publicized articles about gender studies or politics come from a minority of majors. Most grads just want the requisite credentials for a middle class lifestyle, not to change the world or rise to the highest tiers of power.


> the college wage premium is fatter than ever,

The more accessible (and popular) college is, the bigger signal it is that someone didn't go. That premium could just as well be the result of the (relative) wage floor falling out for non-grads.

No comment on the greater thesis. Just noting that those two things might not relate.


Absolutely. FiveThirtyEight has an excellent article [1] from 2016 that reiterates what you say here. In short: A lot of the tropes about college elitism are bogus. (But instead of relying on conventional wisdom to support their argument, they use data, and some might find that too elitist.)

[1] https://fivethirtyeight.com/features/shut-up-about-harvard/


Why do the tropes persist? Do the people writing these articles not know anyone with an accounting degree from a state school?

It does feel like people like me are forgotten, ignored and dismissed by these cosmopolitan selective school elites.


Memes. Ideas get into the water of society and get spread without re-examination.


The only time you hear these people speak is when they tell you how much they saved on tuition. So they are pretty rare.


> and unemployment for grads is really low at just 2 percent, the barista meme notwithstanding.

That 98% number includes barista employment.


If that's the case, then why is student debt such a problem? Or is that equally overhyped?


Student debt is structurally not the same as other debt because it's non-dischargeable, and this changes a lot, including who is offered the debt in the first place.


The latter difference is a good one in a great many cases. If asked to name two major factors to elevate people into the middle class, I’d choose education and ownership of their home.

Policies and structures to allow people on the precipice access to funding required to take the leap and which attempt to keep the system stable by putting personal “skin in the game” find support with me.


The wage premium holds even after accounting for debt.


> unemployment for grads is really low at just 2 percent

Not long ago it was below 4% for all workers; how recent is that number?


> For example, early in ancient empires, many rose in status via winning military battles, or perhaps by building new trading regimes. But later in such empires, status was counted more in terms of your connections to other statusful people. Which led to neglect of military success, and thus empire collapse.

Is this being said seriously? As someone who once aspired to be a historian, this sounds like a massive oversimplification. But maybe it's a hyperbolic rhetorical device. It does sound Toynbeesque, though, so maybe the author is taking this from Toynbee's work?

I feel like these examples/arguments/proofs need to be developed a lot more to be consistent, especially for an article hosted on a domain called "overcomingbias.com".


On the philosophical level, gotta love the unashamed "starting wars is a good thing, never mind the pain, destabilization and suffering you caused" in that sentence.


I have been seeing more and more press hits for "elite overproduction" recently. It stinks of an ulterior motive: the middle class has been under withering financial attack for decades and I fear is now weak enough that the overlords smell the opportunity for a frontal assault on the concept itself.


The middle class has been bifurcating, with more rising into the upper middle class than falling into the working class. Where would you place the roughly 60% of the departing middle class that "disappeared" by rising into a higher class? Are they the barbarians at the gate who need to be pushed down by the elites, or are they part of the power structure now?


I draw the lines based on the taxes. Those are a good proxy for true power.

If the bulk of your tax bill comes from special capital gains rates on the fraction of your net-worth-increase that you are ignominiously forced to admit constitutes capital gains, you're an elite. If the bulk of your tax bill comes from a higher income tax tier, you're middle class. If the bulk of your tax bill comes from a lower income tax tier, you're correspondingly lower class.

Needless to say, the principle that taxes reveal true power does not lead me to agree with your premise that 60% of the middle class has disappeared into the upper class.


Just to clarify: I'm saying that of the small percentage of the middle class that disappears every year, slightly more than half rise while slightly less than half fall. I acknowledge it was unclear, but I don't mean that 60% of the entire middle class is now upper middle class.

https://www.brookings.edu/research/squeezing-the-middle-clas...

> the main reason for the shrinking of the middle class (defined in absolute terms) is the increase in the number of people with higher incomes


A non-founder CEO (not a Zuckerberg/Bezos type that owns a large fraction of the equity in the company) that makes $20 million a year will probably be paying regular income taxes on most of their income. I’m not sure you can accurately describe them as middle class.


If this hypothetical CEO indeed pays regular income taxes on $20 million a year, I'd describe them as new money that hasn't yet assimilated into the power structure but will probably figure it out soon enough. With help, if necessary, and at those levels the help will seek them out.


This is my feeling as well. I might be concerned with "elite overproduction" if the middle class hadn't been shrinking for decades.


Not mutually exclusive, 'elite' never meant middle class, and you used to be able to attain a middle class lifestyle without having to be 'elite'.


Right. That never stopped being the case and should never stop being the case. The concept of "elite overproduction" wants to trick you into thinking it did.

The "elite overproduction" framework begs you to not view the middle class as a group of people who have achieved a balance where they do valuable work and are rewarded with nice things, but as a group of scheming do-nothings who pose a burden to society as they foolishly aspire to elite-hood and repeatedly fail. It's a laser-guided propaganda missile aimed directly at a fault line in populist movements that have been gaining steam recently.


Let me offer a defense of the meme, because I think you've got the causality backwards. I think that because I'm of an age where, had I gone straight to university after high school, I'd have graduated straight into the teeth of the 2008 recession. So I know how my cohort thinks about the plight of the middle class. And almost everyone my age or younger, given a chance to think about it, agrees that the middle class cannot be saved. It's considered a lost cause, and among those who do believe it can be saved, their arguments amount to a blinking confusion and an insistence that it must be.

Given that people like me have only ever participated in the economy during a time when the middle class was shrinking, what are we to do? Most of us agree, if our future is a world of a few 'haves' and many 'have nots', priority one is to become a 'have'. Even from the most moral stance imaginable, a 'have not' has no ability to help others. Hence, too many of us try to get into the elite. To aim for the middle class is to aim for grinding poverty, and why would anyone?


That's an interesting thought, I'm not sure I agree but I'd have to think about it more to differ.


I don't see that in the "elite overproduction" framework. I see a bunch of desperate kids trying to stay in the middle class that their parents supposedly were in. You need an increased level of specialization to stand still.


I agree, and I think non-elites should also have more access to middle class earnings and status. They don't, and many of them are also frustrated at the lack of opportunities they face. But if I were to say that we have a non-elite overproduction as a result, that would sound stupid, because it is.


People write about the stuff they read.


People also promote ideas that benefit their interests.


You sound like a "disgruntled" and "disappointed" "status seeker". /s


:}


Seems like less a problem of 'elite overproduction' and more a problem of relying on hierarchies once a society reaches a level of success where it begins to create many exceptional people.

Maybe in some societies it's time to start working to break hierarchies and cease training people to operate within and climb them. It's time to redistribute the wealth, because if we don't the 'excess elites' will rationally hoard it -- leading to civil unrest and ultimately collapse.


I wouldn't phrase it that way, but I think I believe something very similar coming from a different cultural background. I'd be curious whether you would agree that the issue isn't so much any hierarchies at all, but too few of them. I still want to reward the best podcasters or authors or aerospace engineers, the issue is when too large a portion of society rewards too small a number of people for too broadly-defined a set of criteria. And elite overproduction is sort of when too many people want to be at the top of the biggest hierarchy for poorly defined reasons, rather than being the best author they personally can be.


Absolutely agree that it's wise to reward and give special attention to the most skilled people.

I agree with your perception of the problem. It makes good sense, and this framing could be used to guide reforms you might compare to antitrust law. Do you have anything in mind that could be done from a policy perspective? Because I don't think that asking people to turn inward for validation is going to work.

Let's say we find some policy that works and can be enacted. I'm not sure that's enough to fix the problem, just as policy encouraging a competitive market hasn't fixed the problem there, either. It will keep popping up and finding new ways to subvert the policy.

I think that just as capital creates capital, hierarchy creates hierarchy. Whether it's human nature or the structure of our societies, the outcome appears always to be consolidation. If the efficiency gains of consolidation cannot counter the social/civil instability, and I don't think they can at a certain level of development, then we should tackle the root cause. But I'm repeating myself.

I appreciate your perspective, thank you for letting me see it.


It's a difficult question, and I wish I had a better answer than 'read Laloux' [1]. But I can try to summarise the perspective I gained from that exercise. On a policy level, anything that reduces the amount of power concentrated in any single hierarchy would probably help. But I agree policy is probably not the best place to make lasting changes. Culture seems far more promising to me: if the cultural assumptions shift towards becoming the best in a new field rather than competing in a crowded one, or towards helping your junior colleagues make good decisions rather than making good decisions for them, then I think we'd be getting somewhere.

Obviously, that is a slower thing to change than anything that could be done at a policy level, but I think it gets us towards the right metrics. On an optimistic note, I do think that organisations which make more use of all members' judgement have advantages over those who concentrate decision making - here's a general making that same point in a military context [2]. As for where the points of maximum leverage might be to achieve that kind of change... I'd certainly be interested in any suggestions!

[1] https://www.reinventingorganizations.com/ [2] https://www.youtube.com/watch?v=i7B5pFSq7XA


Really interesting concept with a lot of explanatory power. I work with some of the people behind this idea on a global historical databank called Seshat: http://seshatdatabank.info/. The idea is to record all of the political and social datasets from human history (big parameters!) and provide them in a machine-readable format; basically, trying to understand the past (where elite overproduction occurred etc.) and make data-driven predictions about the future. Uncertainty, complex schemata, complex data, and difficult-to-digest formats abound. The project was the starting point for TerminusDB: https://github.com/terminusdb/terminusdb


What could falsify this hypothesis? I feel like I can tell just-so stories about the all-consuming importance of elite status in the post-war United States that are just as shocking and compelling as what is written here. Did you know people used to smoke cigarettes to look cool and keep their weight down? That men working in white-collar jobs wore suits to work to distinguish themselves from the lower-classes (hence the name "white-collar")?


I think the problem has more to do with the continued application of filters based entirely on one's ranking according to some metric. The trouble is that we never have a measurable metric for the goals we actually care about. For example, universities want the students who will succeed the most at university, and thus use some combination of grades, test scores, extracurricular participation, etc. as a proxy for future success. However, future success is a vague goal, and the chosen measures would often fail as predictors even if the objective were formalized, due to a combination of chance and the various biases of the predictor metrics.

This gets disastrous in the long term, because all these filter processes reinforce each other's biases, and because the people being filtered try to game the system. I think to fix some of these issues, we should stop trying to rank everything precisely and introduce some degree of actual randomness into who passes a given filter. For example, instead of passing the top 10% of applicants as in the old system, still keep the top 5%, but do a weighted random draw among the sufficiently qualified to fill the other slots. I think this would also help improve diversity, without the use of true affirmative action.
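
To make that concrete, here's a minimal sketch in Go (the function name, the hard-coded 5%/10% thresholds, and the choice of raw score as the lottery weight are all mine, purely for illustration, not a worked-out proposal):

    package admissions

    import (
        "math/rand"
        "sort"
    )

    // SelectApplicants accepts 10% of applicants: the top 5% by score
    // outright, then a score-weighted lottery among everyone else who
    // clears the qualification bar. Returns indices of the accepted.
    // Assumes scores are non-negative, since they double as weights.
    func SelectApplicants(scores []float64, qualifyBar float64, rng *rand.Rand) []int {
        n := len(scores)
        idx := make([]int, n)
        for i := range idx {
            idx[i] = i
        }
        // Rank applicants by score, highest first.
        sort.Slice(idx, func(a, b int) bool { return scores[idx[a]] > scores[idx[b]] })

        auto, total := n*5/100, n*10/100
        accepted := append([]int(nil), idx[:auto]...)

        // Lottery pool: qualified applicants outside the top 5%.
        var pool []int
        for _, i := range idx[auto:] {
            if scores[i] >= qualifyBar {
                pool = append(pool, i)
            }
        }

        // Weighted draw without replacement until the quota is filled.
        for len(accepted) < total && len(pool) > 0 {
            sum := 0.0
            for _, i := range pool {
                sum += scores[i]
            }
            r := rng.Float64() * sum
            for k, i := range pool {
                r -= scores[i]
                if r <= 0 || k == len(pool)-1 { // last index guards float error
                    accepted = append(accepted, i)
                    pool = append(pool[:k], pool[k+1:]...)
                    break
                }
            }
        }
        return accepted
    }

The nice property is that a near-miss applicant still has a real chance, so gaming a tiny score edge stops being worth much.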


The fundamental problem is work piling. The unit of work isn't 1 hour; it's a "job". If there are 60 hours of work, then one person will get a whole job while the other struggles part-time. Both could have had a 30-hour work week. Instead, they both compete for the 40-hour job.


I think we would keep the top 5% scores, and the remaining 5% would be filled with the people able to "affect the randomness" (read bribe) the best.

There is already a degree of randomness inherent in the process: students don't all study the material with exactly the same level of attention to each part, some will be tired on the day of the exam, some will have something else on their mind while studying because a relative is in the hospital, and some will happen to guess right on some difficult task. So you already have randomness.


That randomness doesn't correct for score bias.


Interesting article and ideas. I wonder how well we can truly know how the thoughts and motivations of “our generation” compares to “past generations” though. How would we _know_ that people today are more worried about status than people in the past? How would we _know_ that in the past merit carried more weight than “noble blood.” That doesn’t seem to me like something that can be quantified, so I don’t see how we can confidently make comparisons over time in any scientific sense, only in an anecdotal and philosophical sense.


Edit: Regrettably, the entertaining quote I copied below seems to be completely misattributed, as seen in the comment below.

This is a quote that I put in a sibling comment that might entertain you. The "kids these days" complaints have been a consistent claim made by each adult generation since antiquity https://www.goodreads.com/quotes/63219-the-children-now-love...



Yet another problem with that quote is that generations are in fact different. Some of them are more violent than others. Some of them drink more than others. Some of them are more submissive than others. Sometimes kids are raised to be assertive; other times they are raised to obey.

This quote is used to devalue anyone who points out some difference between now and then. But it does not prove that such a person is wrong.


In the past, "noble blood" was the metric of merit.


History studies do deal with that sort of thing. But one would have to dig deep into it and keep reading in order to learn it.

The big problem there is that real historians never talk about "past generations" as a whole, which is a nonsense concept anyway. Historians talk about "1735 English aristocracy" or "1865 lower-class women in the American North" and so on, because how society is organized constantly changes.


> For example, early in ancient empires, many rose in status via winning military battles, or perhaps by building new trading regimes. But later in such empires, status was counted more in terms of your connections to other statusful people. Which led to neglect of military success, and thus empire collapse.

Is this really a good example? Building an empire and holding it are different things. Many empires collapse from succession disputes and rebellion rather than a lack of military success over rivals.


I get perpetually annoyed by Robin Hanson’s inability to properly grasp this concept. Peter Turchin’s books are clear about the concept and provide objective ways to quantify elite production, yet Hanson perpetually ignores them.


I discovered Robin Hanson around 3-4 years ago. I read one of his books, went through his blog, and followed his Twitter musings and polls.

Often he would question something fundamental about human interaction that seemed to me (and I suspect most) to have obvious and quite simple explanations. But he is intelligent and couldn't be missing the forest for the trees, I told myself. Clearly he must be operating many steps ahead. Levels above!

Now I wonder if maybe he just isn't so good with trees.

I do not mean to dismiss the entirety of his work. No doubt much of it has been valuable. But he seems to struggle to understand people, in much the same way as one of the aliens he's obsessed with might struggle.


"Elite overproduction" is real, in that only about half of college graduates now work at jobs that really require a college degree. The article seems to have gone off on some other subject entirely.

This isn't a new phenomenon. Eric Hoffer mentions it in "Working and Thinking on the Waterfront", written in the 1950s. He is unimpressed with intellectuals who don't do real work.


> Here is my related hypothesis: we now put more weight on many smaller lower-noise status markers, instead of fewer bigger noisier markers. In particular, we put more weight on markers of connections to statusful people and institutions.

At first I didn't agree with OP's claim that there is more visibility of status. I don't think that is entirely correct: I always knew my family was poor, and I knew who the rich kids were, or who the top performers were. But when I circled back to the top of the article I realized it makes sense: 40 years ago in my childhood it was rich/not-rich, smart/not-smart, hot/not-hot. Now there are dozens of metrics for ranking oneself. I agree that overload of negative data is the anxiety driver.

It's like we're all being forced to become data scientists and numbers junkies. We've metricked the hell out of our well-being. At least for this generation. Aren't we infinitely adaptable? Or will future kids just have shorter life spans due to the social stress?


You can't win at this game. You can only lose. Especially these days, with the "dozens of metrics". No matter how well you do at one (or several) metrics, there are more at which you fail to excel. If you measure your worth by your social standing, and your social standing by status metrics, you lose.

So, don't do that. Don't play a game at which you can only lose.


So the only winning move is not to play? A strange game, indeed.


I think the term "elite" here may be the root of the problem. There are implications that educated people are meant to "rule" society, which may not be true (under militarism, true democracy, etc.), or that they have skills with no useful applications. Maybe we could try "intellectual workers"?

Even traditional humanities can be used to please people, stimulate them intellectually, organize knowledge and persuade (rhetoric). These may not be trivial to monetize in a market, but they are still things a developed society needs. Also note that nowadays we don't need boatloads of people to cultivate land with ploughs and sickles and hand-produce everything. Maybe society isn't well organized, but that doesn't mean we don't need educated people (indeed, perhaps better educated than the modern university system can offer). I think this blog post has some interesting points on the history of humanities education for outside professionals: https://scholars-tage.org/for-god-and-progress-notes-on-trai...

Besides, I really don't like attempts to solve history by armchair reasoning instead of empirical evidence. Take the first "theory," about empires failing. When it comes to Rome, there are some examples showing that there were indeed fewer "great people" accomplishments in war and statecraft, but the more apparent (even to contemporaries) and specific reason was the move from republic to empire: you really didn't want to be too successful, because then the emperor would see you as a threat.

There is this public desire to have a unified theory of how civilizations (or "empires", lol) decline. I don't believe we have enough of a sample size, even if we had sufficient historical sources. But there is always unrigorous thinking leaning towards whatever social views the thinker holds.


I wonder if we would hear so much pearl clutching about “the elites” if every journalist didn’t believe they were one.


Seems like the obvious is being overlooked? If elites are being overproduced, it suggests that whatever is being used to justify that status isn't warranted, and therefore, any inequities are unfair or unjustified. You can move the goalposts but if it's in the wrong direction you just make the problem worse.

This might be why "class grades don't mean anything anymore" rings hollow to me - it's trying to close the barn door when the horses are already gone. Or, to use a different analogy, it's like some variant of the fox and grapes fable (maybe there is a fable for exactly this?): once the elites' markers are shown to be empty, they start saying "well, those weren't good anyway."

That might work for a while, but if there's enough of it, people start thinking the system is rigged.


This is actually the point of the elite overproduction hypothesis, although the article doesn't get it across. People who propose the theory don't believe that actual elites are being overproduced; rather, that we overproduce people who believe they should be elite and act as though they are.

Many people with a degree believe that they should be elites (because they would be in the 1960s) and are disappointed that they are not. The elites have circled the wagons and use friendships to determine eliteness (see WeWork and Theranos - where elites started companies that are total frauds and still were billionaires for a while).

At the same time, it is good for society to have a meritocratic component to the measure of "eliteness." This is why "generals who won battles" was a good example, and college degrees used to be: people who became generals or got into college were generally not there purely on merit, but the cream of the crop became apparent through challenges relevant to the society of the day.

The meritocratic component also allows for social mobility for people who are really damn good at whatever the merit is: great soldiers could find their way into the roman aristocracy from nothing, and colleges always admitted a few exceptionally talented lower-class people who became great scholars. The "elite overproduction" theory suggests that there is too much social mobility for the wrong people, and not enough for the right people.


> Seems like the obvious is being overlooked? If elites are being overproduced, it suggests that whatever is being used to justify that status isn't warranted, and therefore, any inequities are unfair or unjustified. You can move the goalposts but if it's in the wrong direction you just make the problem worse.

The problem is there's no consensus on what's fair. A lot of people think they deserve to be part of the elite, and they have reasonable-to-them grounds for that. The people saying "grades don't mean anything" are probably sincere - they really do think something else is now more important than grades and that's why they focused on that something else. But it all adds up to something far too top-heavy.


This is complete and utter nonsense. In a time of incredible, increasing inequality, the problem is that there's TOO MUCH social mobility? Give me a fucking break.


Producing graduates with irrelevant education and a huge debt is the opposite of social mobility.


It's the debt that's the problem, not the education.


Colleges used to be finishing schools for the rich attached to educational institutions for geniuses (the geniuses keep the prestige level up, and the rich provide funding). Companies started demanding degrees to get people who are geniuses or well-connected, so new schools popped up to produce more graduates, and so on in a feedback loop. Eventually the degree became a requirement and not a signal of "eliteness" any more. The people getting the degrees still feel elite, even though many go on to become baristas and store clerks.

The problem is that college was never about the education (except for a small minority of students who were usually in the sciences), it was always about the signaling. The part that we think is good (the education) is just a side effect.


It's the debt for irrelevant education. You don't need a million Pokemon scholars, and society would do much better if they had gone into engineering instead.



I strongly object to the terminology here. These people are not elite; they are good at gaming the current socioeconomic system and/or have some advantage gained through family relationships or proximity. They are not the smartest, or the strongest, or any other X adjective. If they are smart or strong, that is only somewhat correlated with their societal position.


I know nothing about Social Science, so maybe someone can chime in. Whenever I read about something from the social sciences, it always seems to read as an opinion rather than something rigorously tested.

Is Social Science considered a rigorous science, or will we look back in 100 years and group it in as we do with phrenology?


Considerations:

1) First, questioning the premise—you may encounter a whole lot more pop-social-science writing than the Real Deal. Consider, for example, what a smart and thoughtful person, whose main exposure to the work of historians was watching Ancient Aliens, might think of history as an academic field. I don't know what you've read, but it's possible a lot of it is way toward the entertaining-but-wrong end.

2) As for "rigorous science", the social sciences have many limitations that something like chemistry or physics do not. They are necessarily based much more on observation and comparison of complex real-world systems. Limited ability to perform controlled experiments doesn't make them not science, as there's more to science than just that, but it does mean it can be very difficult to prove solid, definitive results in social science. It does also mean that the edges of social science are a great place to hide and nurture some serious bullshit, and if it's politically convenient for someone maybe even get your bullshit a few well-funded think-tanks—that in addition to how much the heart of it's full of bullshit, too, due to careerist game-playing by researchers, but that's not so different from other fields.


> I don't know what you've read, but it's possible a lot of it's way toward the entertaining-but-wrong end.

Not quite... I've gone through the classics - Wealth of Nations, Das Kapital, the General Theory of Employment, etc. - and I'm about to start Ricardo. What I've found is that although they sound logical and use many examples to make their case, most of them (with some exception for Keynes) are just opinion, with possibly a selective bias in examples.

What's bad here is that these authors each take themselves as gospel, and politicians have then implemented their ideas with a blunt hammer.


Ricardo's going to be a fun one, then. I think you'll find he doesn't make quite so broad a claim re: the benefits of free trade as free-trade advocates tend to imply he does, in his study of comparative advantage. (there's a trend here, since similar things happen with Adam Smith)

There is a divide between political philosophy (and adjacent economic work) and social science proper, that's analogous to the divide between the theoretical side of modern physics and the experimental side but cranked up a few notches. As in most other sciences, journals, low-print-count and narrowly-scoped books from academic presses, and (to some degree) textbooks, are where the scientists really work (this doesn't mean it's all good work, of course)—though the standard reading (which you seem to be working through) is standard for a reason, and guide the vocabulary and context of that work.

> possibly selective bias of examples.

Heh, when you're lucky there are a good number of even biased examples. The really painful ones are mostly "suppose if..." fairy tales spinning fantasies based on some questionable premises and definitions, the odd bad syllogism, and a very small number of cherry picked examples that, if you dig, aren't even presented accurately. Whole schools of political and economic thought can be based almost entirely on this, as their adherents sneer at the illogic and idiocy of those who aren't convinced.


> There is a divide between political philosophy (and adjacent economic work) and social science proper, that's analogous to the divide between the theoretical side of modern physics and the experimental side but cranked up a few notches.

It's not really that analogous. Theoretical physics is at least in theory still predictive, though it may be very distant from the pragmatics of verification. There are areas of political/economic science that are like that, but political/economic philosophy is predominantly normative rather than predictive (Austrian school economics, despite being called “economics” rather than “economic philosophy” and despite occasionally making fact rather than value claims is, and fairly openly admits to being, part of this.) There's a lot of people who are active as public intellectuals on both sides of the predictive/normative world, and the media (who are often the employers of public intellectuals) do an extremely poor job of explaining, when they aren't straight up concealing, the divide between predictive and normative elements of controversies in political and economic debates.


> It's not really that analogous. Theoretical physics is at least in theory still predictive, though it may be very distant from the pragmatics of verification.

Sure, I intended the "cranked up a few notches" to cover the difference, but could have been clearer. Think things that don't even have a clear path to verifiability or testability but continue to generate papers (string theory, maybe), only even more speculative.

> There are areas of political/economic science that are like that, but political/economic philosophy is predominantly normative rather than predictive (Austrian school economics, despite being called “economics” rather than “economic philosophy” and despite occasionally making fact rather than value claims is, and fairly openly admits to being, part of this.)

Absolutely! Part of the trouble is that to chart a course in politics or economics, you have to choose a desired outcome with winners and losers, so there's inherently a normative aspect to it. There's a whole lot of "ought" involved that's practically unavoidable. Throw in conflicts of interest and it's a hell of a mess. It's like if no-one had been able to agree on where Apollo 11's Eagle should land, and also they're already in flight, and also gravity doesn't always do what we think it does, and also several of the people involved are being paid to shill for particular landing sites, even if the spacecraft can't physically reach them within its fuel budget, but they've got some really fancy-looking explanations about the whole "gravity is weird sometimes" thing so maybe they're not full of shit?

To pick on the Austrians in particular (hey, you brought them up, not me, I only posted a complaint about their broad political sphere's style of argument without naming them :-) ) they've got a bad case of presenting their ethical egoism as also satisfying consequentialist approaches better than any other framework, just by happy coincidence. This should set off one's alarm bells at full volume, but it sure does help them cast a wide net when evangelizing.


Thanks for the comments!

Maybe Carlyle was right after all - it really is the dismal science :)


> Is Social Science considered a rigorous science, or will we look back in 100 years and group it in as we do with phrenology?

I think there's a middle ground between being a rigorous science and being an anti-science like phrenology. I think there's no problem with having soft sciences, as long as the difference between them and hard sciences is understood, and one does not confuse the levels of rigor and certainty between the two.


I think the blurring of lines is what I'm worried about.

There's a lot of soft science when it comes to political economy, and countries have gone all-in with policies from different branches even though there's really no hard evidence or rigor behind some of these ideas. It's mind-blowing when you think that millions of people's lives are affected because politicians have acted on gut instincts about a soft science that merely sounds scientific.


These are fundamental misunderstandings that I often see on HN. First, world leaders must make decisions about the economy, even if those decisions are to, say, permit a more free-market economy. They have no choice in the matter but to decide. The choices they make are based largely on what they consider the best evidence available to them at the time. Second, everyone who participates in the economy has a mental model of how that economy works, using their own evidence. Anyone who says they don't is lying. So if your model of how the economy works is better than the ones they are using, then show us your evidence. Pointing out the flaws in their model is easy and insufficient; you need something better to replace it with, and you must be able to sell it.


Sorry, I think you misunderstood me...

> The choices they make are based largely on what they consider the best evidence available to them at the time

But they're not doing this, and that was my point. A lot of political economy is opinion rather than fact, and these non-fact-based opinions are what politicians are choosing over hard evidence.


Not sure whether you're suggesting that decision makers don't incorporate mainstream theory into their decisions, or whether mainstream theory is just opinion and not based in evidence. I think either statement would be wrong.


> mainstream theory is just opinion and not based in evidence

An example here would be leaders making communism national policy.


Agreed. Unfortunately, a concerning percentage of the population seems driven to treat it as the former when defining their core beliefs, like castles built on sand.


This is a blog post.


When the typical economist tells me about his latest research, my standard reaction is 'Eh, maybe.' Then I forget about it. When Robin Hanson tells me about his latest research, my standard reaction is 'No way! Impossible!' Then I think about it for years - Bryan Caplan


Bees are some of the most organized forms of life on this planet.

Bees have different roles; among them, the queen bee. Bees have the ability to create queens by feeding them royal jelly. When for any reason there's more than one queen, the queens fight to the death.


Elite overproduction is already real in developed countries with high pressure on education, such as South Korea. But there is a difference between highly capable people and elites in the sense of dominance structures.


“Here is my related hypothesis: we now put more weight on many smaller lower-noise status markers, instead of fewer bigger noisier markers. In particular, we put more weight on markers of connections to statusful people and institutions.”

I wonder if the writer is familiar with the class system? Or has noticed the democratisation of connection through technology? From a historical perspective, this part at least seems backwards to me. At best, the "now" is irrelevant, and there is either no such trend, or it has changed more in form than in quantity.

Made me suspect that the rest is over-thought.


From a historical perspective, another side effect of "elite overproduction" is that as "elites" grow in number, they take an increasingly larger slice of the pie, eventually leading to peasant rebellions when enough people cannot even feed their families.

Not sure how it applies to modern society, as we seem to manage food scarcity much better than in previous millennia. And disgruntled elites alone are not enough to start rebellions if they lack popular support.


> For example, early in ancient empires, many rose in status via winning military battles, or perhaps by building new trading regimes. But later in such empires, status was counted more in terms of your connections to other statusful people. Which led to neglect of military success, and thus empire collapse.

The more I think about this statement, the harder it is for me to actually come up with any examples of this happening. While military success of a general might well be useful for status, it is hard to find any examples where that's the primary source of status. The closest I can think up are steppe confederations, whose great size (when they attained such a size) was driven in large part by the personal charisma of the leader, and the loot they could bring to the table, and almost invariably fell apart the moment said leader died, even if their successor were tremendously militarily successful (such as Kublai Khan's conquest of China).

I can come up with some more examples where the military caste ultimately robbed the polity of military success, but these came about because the military caste was unwilling to adapt to modern military technologies and opposed military reforms that might lessen their status. And even then, military nonsuccess didn't lead to empire collapse very quickly (cf., the Ottomans).

> So early on, ambitious soldiers tried to figure out how to win battles, and to get involved in promising battles. But it was hard to guess just how to do this, and outcomes were noisy functions of efforts.

... do you know anything of military history? Because the key to success is invariably training. Whether we're talking about the citizen soldiers of Greek city states (where soldiering was a part-time job) or the professional army of Rome (where it was a lifelong career), soldiers trained. Hell, the sports of elites (e.g., polo, fencing) are basically military training in disguise. And if you're talking not about common soldiers but elite generals, that military training will absolutely include studying the great generals of yesteryear, to be able to apply their tactics and stratagems when the time comes.

> So no one could be very sure of their future status, or with whom to associate to gain status. But later on, ambitious soldiers would need to come from the right family, and make good new social connections.

Is the implication meant to be that common soldiers who were uncommonly good at soldiering could leverage that into high society? Because... again, I'm hard-pressed to name any time that was true. Even in cases such as the Aztecs, where military prowess was a prized status symbol and was theoretically open to anybody (and everybody received military training!), in practice, actually getting yourself into a position to demonstrate military prowess required elite connections. If we instead restrict ourselves to elites whose power derived from their demonstrated military abilities, well, most such people you can point to could only do what they did because of their connections. Alexander the Great was the son of the guy who conquered all of Greece. Julius Caesar was the scion of one of the most powerful families in Rome, and even after he found himself on the losing side in a civil war, he was able to get plum military and political appointments by attaching himself to the richest man in Rome.

So I again ask, what is this society of which the author speaks? It doesn't match with any empire, ancient, medieval, or modern, of which I am aware.


In Huxley's Brave New World, there is a scene where the savage asks the World Controller, why don't you make everyone Alpha Double Plus?

The Controller explains how a society of Alphas couldn't help becoming restless and unstable, so instead they produce just the right social pyramid for a happy society.


Well, until we understand that social hierarchy and social status are pointless values, this will continue to generate problems.

What is the point of considering that some people are worth more than others? Merit already doesn't work; only inheritance and social reproduction really matter.

Why are there so many people who disagree that 100% of people should have access to healthcare, food and shelter? Elites shouldn't really exist. Nobody is really above anybody else. That's the point of having human rights.

Of course some people are more specialized for certain things, but nobody is really "above" others. At best, leaders or people with responsibility are "accountable", and their work is done as an effort to help others, not to reap the benefits.

Market fundamentalists will argue that risk must be rewarded, but investors always work hard to reduce risk to a minimum in order to increase ROI. If the reward is supposed to be compensation for risk, why do they still collect it once they've minimized that risk?


This isn't so much about basic needs, which our societies generally provide. This is about having more than a minimum standard of living: seaside mansions aren't as abundant as 300 sq ft apartments. We're apes who are wired to show off to other apes to attract mating opportunities and secure the future of our genes (by helping our children get a leg up).


no, that's an appeal-to-nature fallacy


It isn't that this behavior is good because it's natural - it's good because it keeps people motivated. The problem is that unfortunately, people are not the same. Some are better physicists than others, some are better organizers, and some are better carpenters. You need some inequality of outcome to incentivize people to do what they are good at - the Soviet Union has shown us that.

Having any inequality at all produces elites. You can have human rights while having inequality, and we generally do.


> it's good because it keeps people motivated

But what is motivation, really, and do we really need it? https://www.youtube.com/watch?v=u6XAPnuFjJc short answer: motivation is not what we think it is. Does motivation matter more than happiness?

> You need some inequality of outcome to incentivize people to do what they are good at -

Most of those people are paid about the same anyway. All professions should be paid about the same. I don't see why inequality is necessary when so many are paid the minimum wage. No job is more worthy than another.

> the Soviet Union has showed us that.

The Soviet Union was a political disaster. Corruption will cause problems in any economy. There is no proof that communism is bad just because the Soviet Union was corrupt and collapsed.

Even the notion of merit doesn't really exist. We just disagree and I honestly don't want to spend time arguing about this eternal debate.


Companies get lower cost, higher output labor, for free. The money flows upward, enslaving the middle class with obscene debts as they compete for prized jobs that used to require far less schooling. Meanwhile trust in authorities and institutions falls to new lows. I heard yesterday that 50% of the populace now supports censorship, not of hate speech or anything like that, but of whatever popular opinion currently deems “misinformation.” Darwin, Newton, Galileo: all “misinformation” that benefited from having far fewer gatekeepers than we have today. If this is what the democratizing force of higher education gives us, I’ll pass. We have more educated people and a greater economic divide than ever before. The group with the greatest amount of “vaccine hesitancy” are those with PhDs! If that doesn’t scream elite overproduction, you may be deaf.


>Galileo

Galileo was ordered to abandon heliocentrism by the Pope. Ultimately he was sentenced to prison for his views (commuted to house arrest).

He had plenty of gatekeepers.


There were fewer “elites” then. You are agreeing with me without realizing it. We now have what amounts to millions of little popes: nothing more than school children with high credentials in mostly nonsensical fields, calling for widespread excommunication of heretics from their privileged positions in government and social media. If Galileo lived today we would never know of his positions; he would be immediately deplatformed, entirely erased from the historical record. This is elite overproduction.


>You are agreeing with me without realizing it.

No, I totally disagree with you and just pointed out the most obvious weakness in your argument.


Elite Overproduction is why we have woke and SJWs!


The world needs ditch diggers too.


I agree that societal instability stems from problems within the elites, but I'm not sure it's due to 'elite overproduction' necessarily. I think it has more to do with a divergence of interests within the elites. Even without elite overproduction, if the interests of a significant segment of the elites diverge from the interests of the rest, then you will have societal breakdown, civil war, etc.

One example is the American revolution, where one group of elites in the British empire wanted good relations with the Indians to protect the fur trade while another group of elites wanted to invade native areas and steal their land. I don't think there was an issue of elite overproduction in the British empire in the mid-1700s. But there was an issue of divergent interests, and there was no room for compromise as these competing interests directly conflicted with each other.

Another one is the Civil War, where the interests of the industrial north (the desire to protect industry via tariffs) directly conflicted with the interests of the agrarian south (who wanted to remove tariffs) so they could sell cotton/agricultural goods to foreign markets. Once again, I don't think there was an overproduction of elites in the north or the south in the 1850s.

But I guess if the elite overproduction was severe enough, it could create structural conflicts of interest. But my guess is that conflicts would naturally arise among the elites long before elite overproduction.


> Another one is the Civil War, where the interests of the industrial north (the desire to protect industry via tariffs) directly conflicted with the interests of the agrarian south (who wanted to remove tariffs) so they could sell cotton/agricultural goods to foreign markets.

Of the areas where the north’s and south’s interests conflicted in the Civil War, that’s not the one most people call out.


Doesn’t mean that it’s not true or wasn’t also a big deal.


What you say is true, but the poster didn’t really draw a connection to the main point showing why this example was chosen, so it’s a pretty random example to draw and kind of odd.


I like this analysis but am missing the next step. What material interests are in play today?

David Graeber argued that the people who controlled the two major parties in the United States, until recently, were into Finance, which benefits from deindustrialization and a weak middle class, and that Trump was different because his interests were in Real Estate, which actually requires domestic consumers -- hence trade war and antiglobalism.

(For someone as "far left" as Graeber, it sounded more apologetic for Trump than I expected.)

Now, I find this theory interesting, but I'm not sure I buy it.

Do you have another one?


Here's the interview I was remembering:

https://youtu.be/5Gq16RO2XB0

I can't tell if there's truth to it or if he's just really good at stringing words together.


I see a more meaningful split within the elites between industrial and post-industrial magnates. This maps better onto the positions and preferences of the two political parties.


I think ideology plays a strong role in addition to interests. In my mind (from reading about this a lot) there are roughly four political factions at the elite level (there's probably more variation at the grassroots level).

The most interesting is probably the Republican party establishment. I recently read books by two partisans on either side [1] (both prolific authors and MIT PhDs), and they were in almost total agreement historically on the people and timeline. The Republican party, starting with Barry Goldwater, underwent an internal revolution, and an upstart, more hard-line faction won, driving out the Eisenhowers and Rockefellers. The second author is a member of this group, and he believes strongly in very small government (for example, he is against the post-1950s government and believes America was last America in the 1930s).

The other Republican faction is roughly the Tea Party, which this book [2] goes into in more detail. It became a major force in politics starting in 2010, after the financial crisis. I think Boehner, who was Speaker of the House at the time, has talked more candidly about this period since leaving office. They are angry at the status quo and lean more to the conspiracy side of things.

On the Democratic side there are the left (Sanders, AOC, etc.) and the center-left establishment. They both generally support social-democratic policies, so they can work together, but the center-left is more market-oriented (maybe you could also say capitalistic or neoliberal), and they also differ in political style. I think people are pretty familiar with Sanders et al., but here's [3] an article by a center-left economist talking a little bit about how the nature of this coalition affects what policies are possible.

[1] Krugman's Conscience of a Liberal and Murray's By the People

[2] https://www.amazon.com/When-Party-Came-Town-Representatives/...

[3] https://www.bradford-delong.com/2019/03/passing-the-baton-th...


Nice post and references. This is basically how I see it too. You could increase the symmetry a little by adding to the Democrats the ghost of an FDR/LBJ "New Dealer" faction, corresponding roughly to the displaced Eisenhower/Rockefeller faction of the Republicans. You'd have something like:

- Eisenhower Republicans : New Dealers

- Reagan/Gingrich "small government" Republicans : Clinton/Obama/Blair market liberals

- "moderate Trumpists" (if there can be such a thing) : "Progressives" (who, I will add, are somewhat less "left" than Wokeists)

They form a 3x2 matrix. Each column represents a side, and each row the spirit of an age.

I guess you could call the rows: "Institutions", "Markets", "Movements".

Then, breaking the symmetry, the "Progressives" are moderated a little by the ghost of FDR/LBJ, while the ghost of Eisenhower lives on in the Never Trumpers, who... weirdly, might ally with what "respectable" Neocons remain, and turn into some kind of "establishment" figures aligned as much with the center-left market liberals as with anyone. Sort of a "deep state" faction without a party.


> David Graeber argued that the people who controlled the two major parties in the United States, until recently, were into Finance, which benefits from deindustrialization and a weak middle class, and that Trump was different because his interests were in Real Estate, which actually requires domestic consumers -- hence trade war and antiglobalism.

What does “into finance” even mean? And how can you be in real estate without being “into finance”? The real estate business is all financing.


I posted the interview in a sibling post to yours.

But yeah. What does Graeber mean?

The argument would go something like --

Many countries are in debt to the United States (immediate question: Isn't the US in huge debt to China?), which drives up demand for US dollars (since everyone needs to pay those back), which makes the dollar strong. As a result, if you have access to these dollars at low interest rates (i.e., are a bank), you can get lots of stuff from other countries "for free". But a strong dollar also has the effect of destroying domestic industry, because it makes exports too expensive for anyone to buy.

So he's saying "finance" is all these institutions with access to dollars, and "real estate" is a bunch of other institutions that are one more step removed from the Fed.

Something like that.

Like, are you a cloud provider, or do you run a lot of cloud jobs? Either way, you're "into the cloud", but you're on opposite sides.

That's about as much sense as I can make of it.

Or maybe it's all bullshit. Which would be funny, given the other things Graeber has written. I don't know.

I do notice it's weirdly aligned with RT's narrative. Not that that means it can't also be true.


>Many countries are in debt to the United States (immediate question: Isn't the US in huge debt to China?),

The US is in huge debt to the rest of the world because that is how a reserve currency works. You issue currency (in this case USD, which can only be created through debt); when it leaves the country to enter the world economy, there is not enough USD left inside the US, so the government has to run deficits or cut taxes (i.e., never retrieve the money it created).


Let's see if I can expand and work through what you wrote. Someone else may need to correct parts.

So the Fed/Treasury passes over the void, and in this vacuum forms a dollar and a T-bill, a particle/antiparticle pair. In this transaction, the Treasury sells a debt obligation (the T-bill), which someone buys with dollars. Then again, and again: We now have a few T-bills and dollars.

There are a few purchasers involved -- a few places to which T-bills flow, and from which dollars flow.

One: A T-bill flows to China. A dollar flows in the opposite direction (they used it to purchase the T-bill).

Another: The Federal Reserve buys a T-bill. This is "quantitative easing", or colloquially "money printing" (as it can be done within the current system). The Federal Reserve gets the T-bill, and the Treasury gets dollars, which then fund US government spending.

The T-bill/dollar current to China is superposed with an opposite current: Simultaneously, a different stream of dollars impinges on China, prompting another current of cargo ships in the opposite direction. These carry iPhones, and flip-flops, and everything else sold at big-box stores.

Enter countries in the US / IMF / World Bank sphere. These have dollar denominated debts (they used the loans to build (hopefully useful) infrastructure). Now they do something to acquire dollars, like accept a stream of tourists, or export coltan or palm oil. In the case of the raw materials, some go to the US (palm oil to food processors), and others to China (coltan to whoever makes tantalum capacitors, which eventually end up in iPhones).

In a net sense, then, dollars flow into the "developing" world, and resources flow out to the West, with those needing industrial steps of the "value chain" traveling via China.

And each of those dollars has a corresponding T-bill "antiparticle", held either in China (or another country) or at the Fed. This prompts another flow of dollars to the holders of all those T-bills, which we call interest. Those dollars, in turn, can come from the sale of yet more T-bills.

Now here I realize that my metaphor is wrong. T-bills and dollars only exist in "pairs" when they are created by a QE transaction. Other T-bills attract dollars from outside (e.g. those sold to China).

Finally, I have left out the commercial banks. I'll need to work fractional reserve banking into this somehow.

This is all becoming pretty complicated. But it still feels like a simplified stock-and-flow model is within reach...
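
As a first stab at that, here's a minimal stock-and-flow sketch in Python. To be clear, the actors, the flows, and all the constants (ISSUANCE, QE, TRADE, INTEREST_RATE) are numbers I invented to animate the simplified story above; this is not a real model of Treasury/Fed plumbing, and it still omits the commercial banks.

    # Toy stock-and-flow model of the dollar/T-bill loops sketched above.
    # All quantities are made up for illustration; no real data involved.

    stocks = {
        "treasury": {"usd": 0.0, "tbills_outstanding": 0.0},
        "fed": {"tbills": 0.0},
        "china": {"usd": 100.0, "tbills": 0.0},
        "us_private": {"usd": 500.0},
    }

    ISSUANCE = 50.0       # T-bills sold to China per step (hypothetical)
    QE = 20.0             # T-bills bought by the Fed per step (hypothetical)
    TRADE = 40.0          # dollars spent on imports per step (hypothetical)
    INTEREST_RATE = 0.02  # interest per step on foreign-held T-bills

    for step in range(10):
        # 1. Treasury sells T-bills to China: dollars in, T-bills out.
        buy = min(ISSUANCE, stocks["china"]["usd"])
        stocks["china"]["usd"] -= buy
        stocks["china"]["tbills"] += buy
        stocks["treasury"]["usd"] += buy
        stocks["treasury"]["tbills_outstanding"] += buy

        # 2. QE: the Fed buys T-bills with newly created dollars.
        stocks["fed"]["tbills"] += QE
        stocks["treasury"]["usd"] += QE
        stocks["treasury"]["tbills_outstanding"] += QE

        # 3. Government spending pushes Treasury dollars to the private sector.
        stocks["us_private"]["usd"] += stocks["treasury"]["usd"]
        stocks["treasury"]["usd"] = 0.0

        # 4. Trade deficit: dollars flow to China in exchange for goods.
        imports = min(TRADE, stocks["us_private"]["usd"])
        stocks["us_private"]["usd"] -= imports
        stocks["china"]["usd"] += imports

        # 5. Interest on China's T-bills, funded by issuing yet more debt.
        interest = INTEREST_RATE * stocks["china"]["tbills"]
        stocks["china"]["usd"] += interest
        stocks["treasury"]["tbills_outstanding"] += interest

    print(stocks)

Run over a few steps, China's dollar and T-bill stocks both grow while Treasury debt outstanding climbs, which is at least qualitatively the picture described above.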


A big part of the issue is because the US effectively controls the world's money printer due to the use of USD as a reserve currency. This necessitates a trade deficit so that other countries can actually have access to USD.


The solution to that problem would be the introduction of regional bancors. The EU needs one for internal use. Technically the euro is a very "shitty" bancor which, rather than being used as a unit of account, was directly adopted as the currency of each member state. In theory each country should have had its own branch euro, which would then be exchanged via the bancor, i.e. regional currencies. The bancor is really just a barter exchange for currency; that is what it boils down to.

https://en.wikipedia.org/wiki/Barter#Exchanges
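
If it helps, here's a toy sketch in Python of what I mean by "a barter exchange for currency": member currencies are never swapped directly; each member instead holds a balance in a common unit of account, and trade is cleared against that balance. The currencies and exchange rates here are invented purely for illustration.

    # Toy bancor-style clearing union. Rates and flows are made up.
    RATES = {"FRF": 0.15, "DEM": 0.51, "ITL": 0.0005}  # bancor per currency unit (invented)
    balances = {ccy: 0.0 for ccy in RATES}             # each member's bancor account

    def settle(exporter, importer, amount_in_exporter_ccy):
        # Clear a trade in the unit of account: credit the exporter,
        # debit the importer; no direct currency exchange takes place.
        value = amount_in_exporter_ccy * RATES[exporter]
        balances[exporter] += value
        balances[importer] -= value

    settle("DEM", "FRF", 1000.0)  # Germany exports 1000 DEM of goods to France
    settle("FRF", "ITL", 2000.0)  # France exports 2000 FRF of goods to Italy
    print(balances)  # persistent surpluses/deficits show up as bancor balances

As I understand Keynes's original proposal, both persistent surpluses and persistent deficits in that balance would be charged interest, which is exactly the adjustment mechanism the euro lacks.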


In this context, putting money into Finance means investing it through Wall Street via a range of products such as derivatives/options trading, currency trading, HFT, what have you. You could say making money through speculative assets and you won't be far off.

In other words, Finance here refers to investing money not directly in the real economy (building factories, houses, etc.) but in speculative products such as those listed above. It is possible that money invested in Finance somehow eventually makes it to the real economy. But since the late 20th century the size of Finance relative to the Real Economy has exploded. Multiple trillions of dollars worth of products are traded each day in financial markets, while in the real economy (i.e., people buying/selling goods/services) only a fraction of that amount changes hands. So there is a growing disconnect between Finance and the Real Economy, which is why Wall Street (Finance) and Main Street (real economy) have grown so far apart.

Coming to Trump and Real Estate: a real estate builder's domain expertise is in forecasting future needs and investing in building houses/office space. They invest some of their own money, borrow some from banks, build houses, sell for profit, repay the banks, and pocket some of that profit. So this part is well understood.

A builder can do well only when millions of people are earning well and are able to buy/rent their buildings. Falling wages are really bad news for real estate firms. As Graeber says in that video, you cannot export real estate. So for builders to do well, the domestic real economy must do well. The Finance sector, however, has no such constraints. At the shortest of notice they can seamlessly move their money across borders chasing higher returns. There are times when Finance feels the effects of a worsening real economy, however, as we saw in 2008, when falling housing prices (which were artificially inflated to begin with, thanks to speculation) led to the Wall Street crisis.

I usually don't trust Wikipedia nowadays, but it does have good content in this case [1].

[1] https://en.wikipedia.org/wiki/Financialization


The vast majority of real estate owners are not bankers.


Too many people who think they're smarter than they are and want to be able to tell others how they should think and live.


This sounds eerily like India: the Brown British collaborators who inherited the British colony basically ran it as their personal inheritance.

There are clauses in the constitution to systematically deny any human rights to Hindus. The state restricts education and opportunities to English speakers. Basically the classical civilisation is denied any sense of agency. The situation isn’t much different from the treatment of Native Americans back in the barbaric days.

Little wonder the Anglo-Indian class and their brainless supporters in the West are up in arms anytime a more rooted Indian leadership emerges.

There is much to criticise about the current idiots in power, but the old colonial crooks are so stupid and insipid that with their toxic discourse of ‘Hindu fascists’ and ‘Hindu Nazis’ (in a country which has oppressed Hindus for a millennium) they’re only digging their own graves.


> There are clauses in the constitution to systematically deny any human rights to Hindus

Which laws are those?


This is not at all specific to India, FWIW. Many post-colonial countries have been in the same boat for a long time, which can only be a huge source of instability and conflict.


I think this goes beyond status afforded to you by wealth and education. If you look at beauty standards for both men and women, they are converging around one or two achievable looks via make-up/filters/diet/exercise. Sure, we could all end up with the same credentials, but what’s everyone’s plan when we all also look the same?

I can’t help but laugh.


This is very dependent on your personal bubble. Plenty of communities make fun of the "beauty" standards you mentioned. If anything, making fun of these standards seems to me to be at least as popular in our general culture today.


They make fun of them, and yet the commercialism indicates this fight hasn’t quite been won.



