
Your argument is not "this is a thing which should not be taught" but rather "the economic model of teaching this is bad".

And I agree. But I would also say that the economic model for teaching everything is terrible. So, improving that model will make everything better.

Attempting to special-case a field does not improve things.




   > the economic model of teaching *this* is bad.
I tend to agree with this statement over "the economic model for teaching everything is terrible". Bear with me, as I'm trying not to be petty by taking issue with the word "everything"[0]. I agree that statement holds for the majority of traditional in-person education in the United States, especially at the (public) school and college/university level. However, I believe the issue lies more with the system itself than with any inherent impossibility of optimizing an economic model around education.

Looking at it "from a step back" -- from the perspective of any other product -- the cost is the teacher, facility, equipment, and environment needed to support a delicate activity (learning) carried out by multiple human beings at the same time. It's probably the most expensive way to transfer knowledge available to us, and it's the default way most of us are taught.

In-person hands-on teaching falls victim to the basic problems of scale. Profits increase as the number of students per teacher increases but -- in most cases -- this negatively impacts the quality of the delivered education.

I don't think it's a "wild guess" to say that a lot of us meandering in the comments are self-taught. Sure, we went to college. Some of us even have advanced degrees[1]. But if you write software -- daily -- you've largely learned the details somewhere other than a classroom. Most of the time it's been "for free": reading others' code, online tutorials, actual documentation, etc. These are extremely efficient ways of both teaching and learning -- a single teaching effort can be consumed by a limitless number of people.

There are many teaching/learning modalities that are more efficient -- that provide a better "economic model for teaching" -- than traditional in-person education. One we seem to have stepped away from is the apprenticeship. Puppetry -- though I have no experience in it -- probably benefits deeply from in-person knowledge transfer, and it seems like the kind of work that has historically been taught only via apprenticeship.

[0] Love it when my kids do that ... "oh yeah, but what if ...?"

[1] That's not meant to imply anything negative about such degrees.


Apprenticeship is an economic model. As a skills-transfer model, it tends to be very good at common practices, mediocre at rare practices, and terrible at theory. In order to get expertise in a field, one generally needs to understand the underlying theory.

We used to teach doctors by apprenticeship. Then it turned out that underlying theory was incredibly important to diagnosis and treatment for all the uncommon ailments -- and in a large, long-lived population, uncommon ailments come up a lot. It turns out that medicine is such a large field that doctors need both formal schooling and apprenticeship -- so there's a required supervised period.

Puppetry is a child of acting and sculpture and clothing. There are specialists who are great at one part and not at others; there are generalists who do everything. The practical parts of these fields are probably amenable to apprenticeship; the theoretical parts, not so much.


Many thanks for the reply; you make some excellent points.

I agree completely on the "theory" side of things. One of the reasons CS degrees have value is their grounding in these foundations. While the theory can be learned outside that environment, it's challenging, and most people don't take the time to do it.

I think there's a "happy medium" between pure apprenticeship and what we (mostly) have today. You brought up the model used in medicine -- that feels more like a mixed apprenticeship/university model, with a lot of hands-on work in the field under apprenticeship-like conditions and increasing responsibility/exposure to actual patients. Of course, my exposure to how all of that works begins and ends with medical dramas on TV, so I may be imagining that "ideal scenario".

My own situation was unusual, and I feel it was pretty ideal for both me and the company I spent my college years working for.

I'd started building small-business networks for folks in my teens, and at 19 I interviewed for a job supporting a network rollout at a regional-LEC-turned-national telecom. I'd just started college full-time hoping to complete my degree in 4 years -- I'd have to push that to 6-8 instead, but in exchange for the longer time in school I'd get experience working in an IT department[0] and they paid for my tuition/books.

The tuition reimbursement policy was generous -- it covered at least two classes a semester with books, four semesters a year, at 100% if you scored at least a 3.2 (or a pass in pass/fail scenarios). If the degree was in your field of work, no approval was required; if outside it, you could still receive reimbursement with approval from HR (which was granted whenever the degree was useful to any job in the company, so it was nearly always approved). Some of my work time could even be logged as "working on my degree".

You had to pay back a percentage of any class you took in the two years prior to quitting[1], but IIRC even that was never more than half.
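
To make that payback rule concrete, here's a toy sketch in Python. Only the two-year window, the "never more than half" cap, and the layoff waiver (footnote [1]) come from my description above -- the linear decay schedule is purely an invented example:

    # Toy model of the tuition payback-on-quitting rule. The linear
    # decay from 50% down to 0% over two years is an assumption; the
    # real policy surely differed in its details.
    def payback_owed(tuition_paid, months_since_class, laid_off=False):
        if laid_off:                     # footnote [1]: waived on layoff
            return 0.0
        if months_since_class >= 24:     # outside the two-year window
            return 0.0
        return tuition_paid * 0.5 * (1 - months_since_class / 24)

    # e.g. quitting 6 months after a $3,000 class:
    print(payback_owed(3000, 6))         # -> 1125.0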

On the company's side, the program encouraged loyalty. Once you're in it, it becomes a big factor in "am I going to take this new job?" -- I turned down two excellent offers during that time because they lacked a similar program and I wasn't confident I would otherwise be supported in completing my education[2]. I also really appreciated earning a respectable salary while completing my degree and accumulating exactly zero student loan debt. Two other coworkers and I stayed there 17 years -- at least nine of those were "while I was getting my degree" or waiting for the pay-back period to end.

[0] My hope at the time was to transition to the software development team but I started supporting migrations from mainframe terminals to networked PCs -- some of which were allowed to connect to the internet. Within a year I was writing software full time but (thankfully) not on any of the actual development teams at the company.

[1] If you were laid off you were not required to pay them back.

[2] That sounds a little entitled -- and it is. But I had support at my current job, so it was a "real thing" I would be giving up. Beyond that, though, it spoke to the overall organization's attitude toward professional development.



