There is a story in Hitchhiker's about a nightclub full of robots. No real people. Just loud music and bots trying to get the attention of non-existent human customers. Adams predicted the current state of Twitter long before even the internet was a thing.
Yes, the case would be stronger with specific examples. However, I did not find it alienating, as examples of these 6 myths readily come to mind. We see people appeal to expertise all the time, rather than using their expertise to explain. There are lots of examples of people trying to "solve" economics problems rather than, as Thomas Sowell puts it, realizing that there are no solutions but only trade-offs.
I work at a FAANG (and obviously, I'm not a company spokesperson, just sharing my own experience). Those who are passionate about interviewing internally all seem to agree on not asking leetcode questions. I know leetcode questions get asked anyway, but there's pretty clear internal guidance and training for interviewers saying not to use them.
At least part of the problem is that leetcode questions are easy to ask, and most interviewers don't want to go through the hassle of coming up with a question that scales well to the candidate's experience and knowledge.
> “We should phase in new energy sources and technologies when they are genuinely ready, economically competitive and with the right infrastructure,”
On the one hand, that seems like a reasonable approach. On the other hand, if we only have on the order of decades of known oil reserves, we'll have to phase out of oil use at some point (likely in most of our lifetimes), no?
> It’s difficult to type and think at the same time
I'm pretty good at typing, though nothing special, at around 90-95 words per minute. I type while thinking all the time, although usually the thinking is the slow part.
Others have mentioned speed and accuracy, but one thing that has been a huge advantage to me is being able to type confidently without looking at the keyboard or screen (or only checking infrequently). It is pretty helpful to be able to have a conversation looking at the person while taking notes, rather than saying "let me write that down" every 5 seconds and interrupting the flow of the conversation.
I don't know much German, so I can't read the original article.
It would be helpful to know whether, by "they might otherwise engage in climate protests," the people in question had planned to just say things but otherwise stay out of the way, or whether they had made public their plans to break laws (like blocking traffic, which many climate protesters have been doing lately). In the first case there is no crime, and governments shouldn't be detaining people just in case they commit a crime later. In the second case, even if someone isn't a terrorist, planning to break the law is itself a crime, and detaining them for it isn't "preventative detention."
They haven't been convicted of anything, yet they're being detained anyway. This is not the same as pre-trial detention, if Google Translate is a remotely accurate rendering of the linked article:
"Legally, this police approach is called preventive detention because it is not detention for a crime that has been committed. The police laws of the different states allow this for different lengths of time. In Bavaria, up to one month in prison is permitted, which may be extended by a judge for a maximum of another month.
The so-called preventive or precautionary detention is very controversial. The relevant laws were originally created to prevent terrorists from carrying out attacks. However, this form of detention is now also permitted in the case of the “imminent commission or continuation of an administrative offense of considerable importance for the general public,” as the Bavarian police law states. Lawsuits against this have so far been rejected in Bavaria. However, a final clarification about the legality of this approach is still pending."
I had a math professor in college for Real Analysis. He told us that one thing that helped him learn was to ask somewhat obvious-sounding questions, to make connections and check his understanding. There are at least two good reasons for this: (1) if you don't understand the basic (unsurprising) things, you probably won't understand the more nuanced things, and (2) what counts as surprising varies with the audience.
If someone were to come to me looking for advice along these lines, I'd say: sure, focus on the surprising thing, but it has to be grounded in the familiar, and what counts as "familiar" depends on the audience.
> We have a good leader, which makes all the difference.
I think this is the trick right here. Our paycheck might come from such-and-such company, but really we work for the people in our management chain. I'm currently at Amazon, and I've been pretty lucky with having good leaders, but part of that is because I've had the luxury of being selective (my first director in my time at Amazon was someone I had worked with previously, and when I switched teams I joined to work with someone I knew).
Not everything that we learn has to be learned in college. Not sure if 100% accurate, but a quick Google search shows WVU tuition is about $9k/year in-state, $25k/year out of state... does it really make sense for someone to spend $100k (or $36k, for West Virginia residents) to learn puppetry? If that's your thing, great, why not go apprentice yourself somewhere to learn that skill instead?
Or offer a class or two in puppetry that a theater major can take. Not everything that isn't STEM is "liberal arts" - as this article says, puppetry by itself is pretty niche, and the point of liberal arts is to be more broad.
>Not everything that we learn has to be learned in college.
This honestly is the crux of the issue with higher education. In fact, most things we learn outside of STEM in academia can be learned elsewhere - literature, languages, arts, etc.
I'm 100% for the humanities, and majored in them myself. But they shouldn't cost a fortune to study, nor should they really be taught in a narrow, results-focused academic environment. Now, are they worth learning? Of course they are, and I think they make us better human beings. But spending dumb money to learn something niche that can be learned and practiced on one's own time is probably not a great use of either time or money.
I studied multiple languages, and while they are incredibly useful and absolutely worth learning, academia is not the place to do it. While I did do the Middlebury intensive program (which is hard, and the results were great), I improved the most when I actually went and lived in a country that spoke the language I wanted to improve at. Some of the best speakers of a foreign language that I know got that good by having a girlfriend who was a native speaker of the language they wanted to learn. No formal education needed.
If I wanted to learn puppetry, I'd use YouTube to learn the basics, and then perhaps even document my progress by making videos of my own. I'd probably make a few bucks on it too. People do the same thing for learning animation, acting, music production, video production, etc. Why can't puppetry be the same?
> In fact, most things we learn outside of STEM in academia can be learned elsewhere - literature, languages, arts, etc.
Why specify STEM as being an exception to that rule? Are people unable to learn programming or math outside of academia? It's a shame the arts aren't seen as valid jobs, and so are treated as unworthy of investment.
This is critically important, because in my experience it can work both ways.
Sure, you can learn writing and exposition on the side, but you'll learn them faster and to greater precision in good courses. You can also learn math and computer science, or whatever, "on the side" through research projects and work. It might take longer, but it can happen.
It always strikes me as strange to see all these postings to open STEM coursework, but then this idea that someone who is really bright who just didn't happen to major in a STEM program can't pick up a lot of STEM experience through non-coursework experiences.
I know of someone who had history undergrad and grad degrees, for example, but who became heavily involved in computer science and imaging research because of some project involving imaging some artifacts or fragile records (I don't remember what, although it was in North/East Africa). It launched a new line of work that was pretty much all computer science and image processing. I suspect by the end of it, which was years, they were probably comparable to someone with a comp sci bachelor's degree in their knowledge. I certainly got the sense they were more competent in computer science by the end of it than some of the CS bachelor-level graduates I've worked with.
Stuff like this happens all the time. I'm not saying that there's no value in a STEM degree, but there's this weird fixation on certifications in general that's grossly misplaced. My sense is it persists because it makes HR departments' jobs easier, nothing more.
We live in this world where it's assumed competency is equal to test scores on a general achievement test, and skillset is equal to a degree or certificate. It leads to these gross distortions and over- and under-employment — not only in the sense of who is being hired as an employee, but who is being recruited to accomplish tasks and solve problems in professional capacities in general.
Depends on the STEM. A whole lot of experimental science depends upon lab equipment that is too expensive, too dangerous, or both to try to self-learn with. Or it might be totally inaccessible no matter what. If you want to be an astronomer, you're gonna need time on one of the 4-meter scopes on top of a mountain that they don't just give public access to. Even if you had the billion dollars to build one in your backyard, the light pollution would make it useless. Wanna do medical research? You can't legally, except under the supervision of somebody who already knows what they're doing and has access to a clinical population.
Programming and math are probably the two you can reliably self-teach, but probably the only two, and "programming" in at least some contexts might also benefit from extremely high performance or large capacity equipment you're not gonna find in a typical homelab or be able to afford renting from a public cloud.
I think STEM is suited to academia in multiple ways: first, the chances of success at learning it seem to be higher in a structured, rigorous program compared to independently. One of the problems with online universities is that they don't apply enough deadline pressure, so in many cases only 20% of students who sign up for a course even finish it. If you want to put four years into something and reliably get somewhere, then an academic program still seems like your best bet for that. The other piece of the puzzle is that a STEM degree is a useful credential that can help substantially in the job market compared to being self-taught. Furthermore, most STEM fields have job markets where salaries that can realistically make getting a degree a worthwhile investment are obtainable. I think it's this combination of a good way to learn, a valuable credential, and decent cost/benefit that makes STEM degrees make more sense.
> first, the chances of success at learning it seem to be higher in a structured, rigorous program compared to independently
That seems an argument for all education to take place inside academia. Keep in mind that liking to read history books is not equivalent to a university degree in history.
> Furthermore, most STEM fields have job markets where salaries that can realistically make getting a degree a worthwhile investment are obtainable.
Yeah, that's ultimately it, it's seen as more valuable because you can make good money in tech. Though I'd argue that much of that may not be the case - are those with science and math education making money hand over fist?
Math is the second-highest-paid undergraduate degree in arts and science, and was number one, ahead of CS, for a long time. Many people with math degrees end up in high-paying technology or statistics jobs. Math is both an incredibly marketable skill and a program that often indirectly selects for high IQ, which is correlated with success to a certain degree. You generally aren't going to do abstract algebra or topology for a career unless you are a math professor, but being good at abstract algebra or topology generally requires a lot of skills that are useful in the labour market.
As for the main point, I think the best form of education for something does depend on what the outcomes are. If you're getting a degree that doesn't qualify you for work that pays back the costs of that education, maybe alternative methods have better outcomes (we'd accept a decrease in educational attainment in exchange for lower cost). If being even a small amount better at something is lucrative, we should optimize for performance more than cost.
A lot of STEM requires expensive equipment that most individuals cannot afford in money, space, or the time to operate it safely, though there are exceptions.
Also when a history major forgets a fact or two in their realm, it’s not as catastrophic as an aerospace or civil engineer forgetting a key fact in their domain. The gains from STEM are also more concrete and immediate compared to the liberal arts.
Some parts of STEM are very accessible outside of an institution (math, CS, theory) but for many fields of engineering and hard sciences you need expensive equipment and materials that may also be hazardous to work with if you’re untrained. Even for fields that are mostly done on computers, there’s still the issue of the software used being too expensive for a typical individual to purchase.
The advantage and disadvantage of an apprentice system is that practitioners cannot massively scale up their pupil count (at least normally).
This implicitly aligns incentives to a static arrangement where apprentices are training to replace their masters. It is immediately obvious to everyone which fields can't afford to pay their apprentices, or which fields have exceptionally long times until the apprentice can work independently. Practitioners are also strongly incentivized to only accept strong pupils whom they expect to work with for many years.
The con of this model is that a great master can only teach a small number of apprentices - a great master may also be a terrible teacher.
Undoubtedly, there are some industries that should follow the apprenticeship model - however these industries also massively benefit from a university system which transfers the cost of training to the student/government. There likely should be some alignment between university costs and educational outcomes on a per major basis.
I tend to agree with this statement over "the economic model for teaching everything is terrible". Bear with me, as I'm trying not to be petty by taking issue with the word "everything"[0]. I agree that the statement would be true of the majority of traditional in-person education in the United States, especially at the (public) school and college/university level. However, I believe the issue lies more with the system itself than with the idea that it's impossible to optimize an economic model involving education.
Looking at it "from a step back" -- from the perspective of any other product -- the cost is the teacher/facility/equipment and environment to support a touchy activity (learning) and multiple human beings attempting to do so at the same time. It's probably the most expensive way to transfer knowledge available to us and it's the default way most of us are taught.
In-person hands-on teaching falls victim to the basic problems of scale. Profits increase as the number of students per teacher increases but -- in most cases -- this negatively impacts the quality of the delivered education.
I don't think it's a "wild guess" to say that a lot of us meandering in the comments are self-taught. Sure, we went to college. Some of us even have advanced degrees[1]. But if you write software -- daily -- you've largely learned the details from somewhere other than a classroom. Most of the time it's been "for free" by reading others' code, online tutorials, actual documentation, etc. These are extremely efficient ways of both teaching and learning -- the single effort put into teaching is able to be consumed by limitless numbers of people.
There are many modalities to teaching/learning that are more efficient/provide for a better "economic model for teaching" than "traditional in-person education". One that we seem to have stepped further away from is apprenticeships. Puppetry -- though I have no experience in it -- is probably something that deeply benefits from in-person knowledge transfer and it seems like the kind of work that has probably only been taught via apprenticeship in the past.
[0] Love it when my kids do that ... "oh yeah, but what if ...?"
[1] That's not meant to imply anything negative about such degrees.
Apprenticeship is an economic model. As a skills-transfer model, it tends to be very good at common practices, mediocre at rare practices, and terrible at theory. In order to get expertise in a field, one generally needs to understand the underlying theory.
We used to teach doctors by apprenticeship. Then it turned out that underlying theory was incredibly important to diagnosis and treatment for all the uncommon ailments -- and in a large, long-lived population, uncommon ailments come up a lot. It turns out that medicine is such a large field that doctors need both formal schooling and apprenticeship -- so there's a required supervised period.
Puppetry is a child of acting and sculpture and clothing. There are specialists who are great at one part and not at others; there are generalists who do everything. The practical parts of these fields are probably amenable to apprenticeship; the theoretical parts, not so much.
Many thanks for the reply; you make some excellent points.
I agree completely on the "theory" side of things. One of the reasons CS degrees have value is in these foundational aspects. While they can be learned outside of that environment, it's challenging and most people don't take the time to do it.
I think there's a "happy medium" between pure apprenticeship and what we (mostly) have today. You brought up the model used in medicine -- I feel this model is closer to a mixed apprenticeship/university model, with a lot of hands-on work in the field under apprenticeship-like conditions and increasing responsibilities/exposure to actual patients. Of course, my exposure to how all of that works begins and ends with medical dramas on TV, so I may be imagining that "ideal scenario".
My own situation was quite unique and I feel it was pretty ideal for myself and the company that I spent my college years working for.
I'd started building small business networks for folks in my teens and at 19 interviewed for a job supporting network rollout at a regional-LEC-turned-national telecom. I'd just started college full-time in hopes of completing my degree in 4 years -- I'd have to push this to 6-8, instead, but for accepting a longer time in school, I'd get experience working in an IT department[0] and they paid for my tuition/books.
The tuition reimbursement policy was generous -- it covered at least two classes a semester with books, four semesters a year, at 100% if you scored at least a 3.2 (or a pass in pass/fail scenarios). If the degree was in your field of work, you required no approval; if outside of it, you could still receive reimbursement with approval from HR (which was always granted if the degree was useful to any job in the company, so it was nearly always approved). Some of my work time could even be logged as "working on my degree".
You had to pay back a percentage of any class you took in the two years prior to quitting[1], but IIRC even that was never more than half.
On the company's side, the program encouraged loyalty. Once you're in it, it becomes a big factor in "am I going to take this new job?" Myself and two other coworkers stayed there 17 years -- at least nine of those were "while I was getting my degree" or waiting for the pay-back period to end. During that time I turned down two excellent job offers because they lacked a similar program and I wasn't encouraged that I would be otherwise supported in completing my education[2]. I really appreciated the fact that I was earning a respectable salary while completing my degree and accumulating exactly zero student loan debt.
[0] My hope at the time was to transition to the software development team but I started supporting migrations from mainframe terminals to networked PCs -- some of which were allowed to connect to the internet. Within a year I was writing software full time but (thankfully) not on any of the actual development teams at the company.
[1] If you were laid off you were not required to pay them back.
[2] That sounds a little entitled -- and it is. But I had support at my current job, so it was a "real thing" I would be giving up. Beyond that, though, it spoke to the organization's overall attitude toward professional development.
There has been a big push in the last 20 years or so to make sure that everyone makes a "living wage". This, in turn, makes it nearly impossible for people who want to learn a trade while making something (anything) from actually doing so. Their work and time are basically worthless. They can't be paid accordingly, so they just don't...
Unless you want to volunteer, but most organizations don't value volunteer time because it isn't "real work".
If you can take out a $100k loan and 4 years off work to get a degree then you should be able to do the same to apprentice somewhere. I get what you are saying, though.
One distinction is between being an apprentice to some craft that doesn't pay much under the best of circumstances and being an at least minimally useful apprentice/helper to a skilled trade that is pretty highly compensated. It's a lot easier to pay the latter decently than the former.
> There has been a big push in the last 20 years or so to make sure that everyone makes a "living wage". This, in turn, makes it nearly impossible for people who want to learn a trade while making something (anything) from actually doing so.
Checking Austin IBEW: a starting apprentice makes about $17/hr plus benefits, and pay goes up every year, guaranteed, until hitting about $34/hr plus benefits at base rate for a journeyman. "Living wage" isn't going to impinge on that except at the very bottom.
However, people don't want to do trades because it's really hard work.
Many people object to unpaid or minimally paid internships but that's pretty much the only option for many types of apprenticeships where the apprentice is learning but isn't really doing anything very useful.
The problem ultimately boils down to the 710% increase in tuition.
If college education had retained the same tuition costs as it did 30-odd years ago (increasing at the rate of inflation only), that puppetry major would have cost something like $20k over 4 years, probably paid off by the student doing part-time work at the library, etc., and the student would have graduated debt-free with some part-time work experience, free to take a few years to see if they could parlay their experience into the next Muppets. If rent hadn't blown up the way it has either, they could have supported themselves with a bartender's or server's job, and if the thing wasn't working out, started taking some courses on the side. A few years later they'd have leveraged all that work experience or those side courses into a real job, or taken up a master's and made a career switch, having given something really interesting and different a go, and the entire world would be better off for it.
However, the massive increase in tuition costs and rent, with no corresponding increase in the safety net, means risk-taking has pretty much been eliminated at the individual level, which will cost society overall.
> does it really make sense for someone to spend $100k (or $36k, for West Virginia residents) to learn puppetry?
No. School should be free.
> why not go apprentice yourself somewhere to learn that skill instead?
Because schools should help set those internships and apprenticeships up.
> puppetry by itself is pretty niche, and the point of liberal arts is to be more broad
You have that strangely inverted. Because liberal arts are so broad as to include the arts and because the arts are so broad as to include puppetry, the point of liberal arts is to offer niche studies like puppetry.