I can see why this topic is interesting to people on Hacker News, because "self-study" is not just an option in software development, it's really the only way. Yeah, a lot of us have CS degrees or degrees in related fields, but in the end, you have to read, absorb, prototype, evaluate, adopt, or reject thousands of pages of dense material every year to stay current.
I really do think that software developers may be unusually well prepared to study law this way, because developers really are accustomed to massive amounts of self-directed learning.
A big difference between software development and the legal world, alluded to in the article, is that the legal world is the most snobby, credential-bound field I've ever observed.
While I've occasionally seen developers express frustration that this or that employer strongly prefers graduates of a particular school -- by and large, if you have a string of accomplishments, all doors are open to you. The legal field is the exact opposite. There are entire career paths that are simply only available to those who attended one of a handful of law schools (the legal academy being perhaps the worst).
If you ask someone from a top firm, the advice you'll get is that law school is a great option if you're able to get into a tier one law school and can reasonably expect to be in the top ten percent of your class. There's a bit more wiggle room if you're at one of the top ten schools, but not much. Absent that, you're not getting hired by one of the big law firms that pay enough to make any progress towards paying off your student loans. You'll be lucky if you're able to keep up with the payments. If you're stuck in a shit contract review job, like a lot of new graduates are, you're screwed ten ways to Sunday.
Some regional firms recruit from lower-ranked schools that are located nearby, but in some ways, the competition for those few spots is even worse.
Needless to say, I'm glad that I changed my mind about going to law school.
Even if you win, you lose, because those top-firm jobs tend to be brutal, quality-of-life-wise.
There are legal jobs that have good work-life balance, involve interesting work, pay decently (if not great), don't require you to hustle for clients, and are fairly well thought of by society at large: federal court judge and tenured law professor. For both, it would behoove you to go to Yale or, failing that, Harvard.
(Keep in mind when looking at these links, Yale's class size is about 1/3 Harvard's.)
You cannot reasonably expect to be in the top ten percent of your class. It's different from science, CS, and engineering classes. First, because the LSAT is pretty good at matching you with classmates of similar ability. There are definitely exceptions, but you have no idea if you are one. Second, the law school environment makes everyone compete at a high level. It's not a CS class where half the class is more interested in their side projects or finding a date for the house party on Friday. Third, the classes are graded on a bell curve and the tests are subjective. Even if you know 100% of the rules, you won't get 100%.
The exams are pure application of rules. You argue for and against liability/guilt/etc.
And that's a huge part of the problem. Even for someone who is brilliant, law school is a massive, quarter-million-dollar gamble. You're making a decision based on assumptions about what law school will be like, drawn from a very different experience you had as an undergrad. If you're wrong, you're in a financial position that's going to take decades to get out of... if you ever do.
Selection bias? The credentials of the type of people hired into such positions help those people get into those positions. Even without "discrimination", so to speak, when you get hired right out of uni instead of spending a few years freelancing or being a code monkey at some random shop, it makes a tremendous amount of difference to your career...
That's just a legal means of hiring people like you. You can't screen for people who look like you or originated from the same place, but you can narrowly focus recruitment.
Sure, a top-tier degree will absolutely help you get a leg up.
But there are law firms that pretty much only hire graduates of the top five schools. This is different from even Google, where a Stanford degree might help but there are still plenty of self-taught or non-Ivy developers.
Because when choosing a search engine, nobody would care if it said "Brought to you by top 5 grads", while in law, that gets customers to pay top dollar.
Maybe people care more about outcomes when they're spending their own time instead of someone else's money?
I think it has more to do with quality (and bullshit) detection. If a search engine sucks, it's pretty easy to try a different one and see if the results are better. If you get hosed in a lawsuit, it's not as clear that the outcome would have been better with a different firm. Pedigrees matter a lot more when it's hard to directly evaluate the quality of the output, so decision makers default to signalling.
> Yeah, a lot of us have CS degrees or degrees in related fields, but in the end, you have to read, absorb, prototype, evaluate, adopt, or reject thousands of pages of dense material every year to stay current
So do most practitioners in careers that are typically associated with or require a university degree: actuaries, accountants, engineers, lawyers, doctors, teachers, etc. all have to stay current with their respective fields and do so primarily through self-study.
Of course there exist people in all of those careers who don't. But the same is true in software.
In fact, I'd argue it's a little harder for licensed professionals to not stay current with their fields due to the requirement for continuing education credits. Nevertheless, I don't see how any professional can be competitive in their field without keeping current, especially as the world continues to flatten.
The idea that people in tech are "unusually well prepared to study law" through apprenticeship, as framed by the GP, strikes me as arrogant and nescient of what other professionals do. Actually… it reminds me of an xkcd: https://xkcd.com/793/
I agree. The main difference is that all of the fields you've listed already have a formalized study and exam path. Software is one of the few knowledge-based fields that you can enter purely on self-study at the highest level (I'm not sure about actuaries, if you studied math on your own, could you enter the actuarial field? Not sure about this one).
As a result, I wonder if maybe software developers are a little more inclined, as a "profession", toward paths that are less formalized.
> if you studied math on your own, could you enter the actuarial field? Not sure about this one
Yes (edit: accidentally said the opposite of what I meant!). What you need to be able to do is pass the exams. BUT -- plenty of people with math degrees fail these exams. It's a hell of a lot harder than learning how to hack together a RoR site.
You can also e.g., fill a number of supporting roles around actuaries without a college degree and then demonstrate competence by taking exams.
> I wonder if maybe software developers are a little more inclined, as a "profession", toward paths that are less formalized.
I think the fact that everyone in software has the same job title makes a huge difference wrt medicine and engineering.
The guy hacking together cookie-cutter single-server websites calls himself a software developer/engineer just the same as the Ph.D. hacking on compilers or ML algorithms. Everyone inside the field knows there's a huge difference between these two jobs, and that becoming the latter basically requires a degree (perhaps even an advanced one) or really extraordinary dedication, while the former probably doesn't even need a degree at all.
So when someone says "do I need a degree to become a software developer? / Is a degree the fastest path to becoming a software developer?", it's an ill-defined question. What do you mean by software developer? Do you want a good salary doing whatever in boom times or do you want to do really technically interesting work with job security even in bust times?
Conversely, most other fields tend to differentiate between job titles that require lots of skill and jobs that don't.
For example, the difference between various levels of non-doctor healthcare workers (the whole range of nurses, and then everything up/down from there) is very institutionalized. Something similar happens in engineering -- plenty of non-engineers do engineer-y things in engineering firms, but some things you'd be insane to delegate to someone who doesn't have formal engineering education.
> The guy hacking together cookie-cutter single-server websites calls himself a software developer/engineer just the same as the Ph.D. hacking on compilers or ML algorithms.
From what I can tell, none of the lead creators of PHP, Python, Perl, Ruby or JavaScript actually have PhDs. The original author of Turbo Pascal and chief architect of Delphi and now of C#, Anders Hejlsberg, didn't even finish his undergraduate degree.
It's not just a matter of similar titles, the paths are really less formalized.
This comment completely ignores and perhaps even intentionally muddies the central thesis of my parent post, which was: there is a distinct and codified divide between "grunt work you can do with minimal training" and "seriously difficult work" in other fields that does not exist in software.
For instance, the sentence right after the one you quote ends with: "...or really extraordinary dedication". The handful of people you mention in your post, I think, count as extraordinary.
To reiterate my point:
The IT guy who helps hack out some spreadsheets isn't an actuary. The nurse who changes sheets isn't a doctor. But in software, there's no distinction -- in terms of externally recognized job titles -- between the guy hacking on WordPress templates or apps that burn my phone's battery and steal my data for no reason, and someone working on a piece of critical infrastructure.
I can't really respond to your anecdata without starting a flame war, but suffice it to say that perhaps some people would consider PHP or Perl or JavaScript cogent counter-examples to the point you're trying to make.
Finally, lots of actuaries without degrees rise to the top of their organizations and get to work on really cutting-edge stuff. Those people are extreme outliers. The same exact thing is true in software. Literally the only difference is that actuaries don't have the same sort of blindness to statistics that software people do, and I think the reason is that there's a clearer distinction between "grunt work related to actuarial work" and "professional-grade actuarial work".
If we made that distinction in software, then I think the perception that you can do real software engineering without a degree would shift far closer to "Exceptional" than it is now.
I did mostly ignore the central thesis of your post, which is why I quoted the specific part I was replying to.
You say they count as extraordinary, but by what measure? Reading the origins of those languages, they don't strike me as such. Unless you consider them extraordinary because they wrote languages and compilers without having advanced formal education, but then your claim seems rather circular.
> I can't really respond to your anecdata without starting a flame war, but suffice it to say that perhaps some people would consider PHP or Perl or JavaScript cogent counter-examples to the point you're trying to make.
Well, no, because the point I'm making is that one can be employed to hack on some of the most used languages and compilers without having a PhD, and these examples strengthen it.
Now, it might be that these languages are poorly built, which is why many non-PhDs can get jobs building them, but that is an explanation, not a rebuttal.
And I'd point out that one doesn't have to be blind to fail to see statistics that aren't there. Would you care to provide them?
> one can be employed to hack on some of the most used languages and compilers without having a PhD, and these examples strengthen it.
Of course. You can do most anything without a degree, with very few exceptions (see: the article).
But the degreed : undegreed and PhD : non-PhD ratios in language design/compiler implementation are much higher than in software development more generally. The same is true of other technically challenging things. If you agree with that observation, then I think we're violently agreeing, or you're reacting to the tone rather than the substance of my post.
> Would you care to provide them?
Anecdata, and bearing in mind that "originator of a language" is a very, very small subset of all people working on compilers / language design (and that even there, I think my observation holds -- especially for degreed : undegreed, and probably even for PhD : non-PhD). And that compiler/language design is only one very small segment of what a reasonable person would consider to be core infrastructure.
I don't have the time or the interest to run this study. Feel free to do so.
I find a lot of things to be interesting work... I enjoy learning a new line of business's logic and rules more than writing software. I'd rather think about data storage and flow in an application than conceptualizing it all, though I enjoy that too.
I spent a little bit of time (less than a year) in upper management even, and hated it. I wouldn't be happy optimizing algorithms to serve ads better, or track financial systems. There's a place for that, sure... but it just isn't interesting to me.
There's fun in trying to keep current with new things... Hell, I've been sticking with node.js since its inception, and that's been a very wild ride. I like web-based applications, despite other people hating them. In the end, I like delivering usability and solutions to the end user. I started out as a designer/artist in a former life (been in software for nearly two decades now).
The conventional path was never the right thing for me. That doesn't mean I'm not able to do the type of work being done at Google, Netflix and the like... And I've even considered it. Just because I don't have a degree does not mean that I haven't studied, and am not able to do "really technically interesting work."
I've spent the better part of twenty years studying, designing and developing software systems, with far more real experience than someone stepping out of college. To think that someone who's spent that much time is always inferior to someone with a formal education is simply ignorant.
As to engineering, take a look at where radar came from sometime. It wasn't formally educated engineers; it was guys with on-the-ground experience. There are a lot of truly great achievements in humanity that didn't come out of a formal education.
Okay, going to stop now, I just found the parent post very arrogant and frustrating. Arrogance irritates me more than any other behavioral trait.
> (I'm not sure about actuaries, if you studied math on your own, could you enter the actuarial field? Not sure about this one).
Yes, anyone can take the exams, there are a handful. And if you pass them all nobody really cares about your degree. But from what I gather you'd better be pretty sharp to pull it off.
Except maybe for doctors (I don't have any in my family), the rest are simply trained by their employer. That is hugely different from developers, who are expected not only to train themselves at home but also to research the field to know what they need to train in.
The engineers in my family think I'm a failure because my employer apparently doesn't see enough value in me to train me. (The "it's just computer programming" attitude doesn't help, either.)
Though for applying for new programming jobs, it all comes down to learning algorithms and data structures. I think a lot of the neat stuff programmers learn or do on their own is completely useless for whiteboard interview questions. "Look at this open source library I created and everyone uses!" seems to be useless for programming interviews.
A missing ingredient in Software Development and CS work is credentialism. There's no software "Bar Exam" you must pass before you are legally allowed to offer your services.
And there's no "Supreme Court of Coding" either.
But there absolutely should be both.
Programming is littered with people "just trying things" and "hacking" and other crudities like "experimentation" and "exploration."
We need regulation, regimentation and a predictable level of service.
You should also be required to obtain a license (after a 90 day waiting period) for each computer you own, even if it's not going to be used for programming.
It's the only way we can reduce the number of data breaches and crashes.
I hold a professional engineering license, having passed the computer engineering exam. The software engineering exam was first offered a year or two after I took my exam. (In Texas, the first state to offer the software engineering exam.)
Anyway, I passed the test with no problem, and I think all the licensing exam does is filter out people who are totally incompetent. It guarantees a level of quality, perhaps, but that bar is pretty low.
While the bar exam enforces a certain quality of knowledge, it also enforces breadth. Every lawyer in a given jurisdiction takes the same bar exam, and it covers all aspects of the law.
The programming equivalent would be like forcing coders to demonstrate competency in everything from embedded systems programming to OS development to packaged applications to web back-end to web front-end, mobile apps, algorithms, networking, language design, compiler optimization, etc.
The way it works now, coders can specialize, and that makes it easier to self-study. Learning all about web front-end dev is a lot easier than learning about every type of programming there is.
If you want credentialling, it would have to be more like medicine, where people have to pass the boards in their specialty only.
I think you've fallen victim to Poe's law. In fairness, the parent comment gave few enough hints at satire that I'll need to see the author confirm they did or didn't mean it before I'm convinced either way.
Thank you for the feedback. I was attempting sarcasm.
My hope was that the line "It's the only way we can reduce the number of data breaches and crashes" would seem immediately ridiculous to everyone.
Given the history of surprising security bugs in mature code being discovered by a decentralized collection of users and hackers over long periods of time -- a phenomenon we've all witnessed over the years -- I figured the notion of a slow-moving, bureaucratic supreme panel of code judges effectively and decisively removing security problems basically didn't make sense.
I don't know now. Maybe it makes more sense than I thought.
I think most programmers today take it on faith that the lightly regulated, freewheeling nature of programming is an inherent, unchangeable, and desirable state.
But if you look back at the history of highly regulated technology industries, every single one started out lightly regulated and freewheeling. Early railroad, automobile, oil drilling, aviation, and telecommunication companies (to name a few) were all full of self-taught innovators who wanted to move fast and disrupt things.
As software eats the world, the world will become less and less tolerant of shitty software. I will not be surprised at all to see demand for credentialling and oversight grow even in the next 10 years.
The fact is, not every piece of software needs to be completely secure from every possible data breach. Yes, there are cases where more security is important. It's also important to note that many security breaches involve systems that were designed, developed and built by people who would have passed your arbitrary "Bar Exam" -- just look at the breaches surrounding OpenSSL, Windows and the like.
Most of these systems weren't developed by guys hacking away at PHP and WordPress... Also, nobody is going to die because your home computer was botted.
While I agree there should be more to professional software development than what we have today, I'd suggest that a guild system based on reputation would be more appropriate... You only gain reputation by being backed by others who are considered trusted in turn. If you are shamed/shunned/fired for incompetence, then those who backed you also lose reputation.
In the end, nobody is going to support such a system. We're just about the last of the higher-paid professional white/blue collar fields left. Why? Because we negotiate pay. Going to a federated system would only serve to drive down pay and introduce fees to an abstract organization that does very little good. I would also postulate that hiring a great programmer isn't any harder than finding/hiring a great lawyer.
In the end, it comes down to need, understanding, honesty and communication. I appreciate that we're in a field where someone without a degree (myself) has done some serious professional work in government, aerospace, education, security and financial industries.
If the compiler is the bar, the machine code would be the supreme court. You launch an appeal by looking at the assembly produced by the compiler, finding errors, and running it by the SCOTCPU to see if it agrees with you or the compiler.
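(If you actually wanted to run that appeal, here's a toy sketch of the idea, assuming CPython; the is_of_age function is made up for illustration. The standard-library dis module dumps the bytecode the compiler emitted so you can review its work yourself:)

    # Toy "appeal to the SCOTCPU": inspect what the compiler actually
    # produced instead of trusting the source. Assumes CPython; the
    # function below is invented for illustration.
    import dis

    def is_of_age(age):
        return age >= 18

    # Prints the compiled bytecode, one instruction per line -- the
    # "record of the lower court" for your appeal.
    dis.dis(is_of_age)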
> Though for applying for new programming jobs, it all comes down to learning algorithms and data structures. I think a lot of the neat stuff programmers learn or do on their own is completely useless for whiteboard interview questions. "Look at this open source library I created and everyone uses!" seems to be useless for programming interviews.
To provide a counterargument, because I do not think the attitude should be "I'm not going to do open source because it is useless for programming interviews": creating an open source library that others use is one of the strongest signals I can think of.
Things that are involved with creating an open source library that I can think of off the top of my head:

- Set up a build system
- Use a test suite (see the sketch below)
- Choose and follow a coding standard
- Issue tracking
- Bug report system
- Source control (good branching practices / whatever best practices your favorite source control promotes)
- Create a specification / roadmap (months of coding can save hours of planning)
- Documentation
  a) Determining convention
  b) Actually creating clear documentation
  c) Updating documentation through changes
- Writing code that just works

And that is all just by yourself. Then if you have users and are building a community:

- Communicate architecture to random strangers using specifications that they understand
- Collaborate to develop roadmap / architecture with input from other developers
- Communicate in a way as to not piss off strange developers who decide to help you for free
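To make the test-suite item concrete, here's a minimal sketch of what that piece might look like, assuming a hypothetical Python library; the slugify function and all names are invented for illustration, not taken from any real project:

    # tests/test_slugify.py -- minimal test suite for a hypothetical library.
    # Run with `python -m unittest` from the project root; a CI job would
    # run the same command on every push.
    import unittest

    def slugify(title):
        """Turn a title into a URL-safe slug (the function under test)."""
        return "-".join(w for w in title.lower().split() if w.isalnum())

    class TestSlugify(unittest.TestCase):
        def test_basic(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_collapses_whitespace(self):
            self.assertEqual(slugify("  hello   world "), "hello-world")

    if __name__ == "__main__":
        unittest.main()

Even a file this small exercises the build/test/CI habits above; the point is the practice, not the particular function.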
The understanding of data structures and algorithms is only a small part of the puzzle. If a programmer can demonstrate all of those industry-standard behaviors on their own time, then I don't care that much if they don't remember what a skip list or some other crap like that is. You can teach a developer all of the special data structures that your code base uses in a week. On the other hand, it can take months of steeping time -- or it may never happen at all -- to get developers up to speed with all of the house-keeping I described above.
TL;DR
Contribute to open source, start your own open source project because the skills required cover 90% of what programmers need to know. The CS theory stuff is the smallest and easiest part to train up. Show that you know how to execute well on the 90%, and companies will fall over themselves to train you on their own dime for the tiny subset of CS theory that they use.
I don't think law school can be fairly characterized as not being self-study. Admittedly, I only went for one year before deciding I did not want to go that route, but it was more like: "Here, go teach yourself this stack of cases tonight. I am going to rip your self-study to shreds in the morning."
There are degrees of "self-taught". There are programmers who majored in CS and stay up on the field, programmers who majored in math and worked their way through the algorithms book, programmers with no degree at all. As a math major, I guess I'll just punt and say the relationship between this and self-directed learning in law school "is left as an exercise for the reader". I do tend to agree with you that all learning ultimately is self directed, though again, that takes different forms.
Here's the thing: if formally enrolling in law school can't be characterized as "not self-study", what can be? We'd be striking the phrase from the language at that point. So while I agree that all learning is self-directed to an extent, I do think it's reasonable to say that programmers do tend to be more "self-taught" than lawyers.
Oh, definitely coders are more self-taught, if we want to jump to that question. I was pointing out that law school is more of a progression through a self-taught curriculum than other formal educational pursuits. (After which you get to self-study for the bar exam.)
In the context of the original article, being mentored by a practicing lawyer is likely to be similar to law school in the amount of self-study, so the idea that programmers would prefer one mechanism over another due solely to self-study is not a premise that I agree with.
"I really do think that software developers may be unusually well prepared to study law this way, because developers really are accustomed to massive amounts of self-directed learning.
"
Except the learning you do has massive amounts of feedback.
You try stuff, it works, you try other stuff, it doesn't.
You won't get that in a law office study program.
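For contrast, here's a made-up example of how tight that feedback loop is in code (nothing here is from the thread, just an illustration):

    # The feedback loop in miniature: try something, run it, and the
    # machine tells you immediately whether you were wrong.
    def median(xs):
        xs = sorted(xs)
        return xs[len(xs) // 2]        # first attempt

    try:
        assert median([1, 2, 3, 4]) == 2.5
    except AssertionError:
        print("wrong for even-length input -- caught in seconds")

    def median(xs):                    # second attempt, after the feedback
        xs = sorted(xs)
        n, mid = len(xs), len(xs) // 2
        return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

    assert median([1, 2, 3, 4]) == 2.5  # passes this time

A law office study program gives you nothing like that assert failing two seconds after you make a mistake.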
Another thing which might happen is software engineers automating away a lot of the law profession. Most of these exams are just memorizing, a system which is a remnant of the 1700s and 1800s -- not fit for an age where knowledge is literally at our fingertips.
Lawyer here. Memorization alone won't make you a lawyer, just as memorizing language syntax and API specs won't make you a programmer. You have to be able to apply principles in odd ways -- in software terms, edge cases and corner cases.
As Justice Holmes famously said, the practice of law is ultimately about predicting what a judge is likely to do. [1] That requires understanding judges' motivations, which include, for example, (A) doing what seems to be the "right" thing; (B) not suffering the professional embarrassment of being reversed on appeal; and, (C) not doing long-term damage to society with a bad decision, a canonical example of which is the Dred Scott Case [2].
Finally, really good lawyers not only try to predict what judges will do, they also grasp the motivations of their clients and of their clients' colleagues, collaborators, and adversaries.
I've long thought that the practice of law is like being a weather forecaster: You have to understand how the different atmospheric phenomena are likely to interact, and to make your best guess as to what's going to happen when a high-pressure system collides with an upper-atmosphere disturbance, etc. (I have no idea whether what I just said makes sense meteorologically).
What a cluster.. see kay. IANAL, and find the above description of what makes a good lawyer as repulsive as it is accurate.
It's a self-fulfilling system, too, thanks to the learning of case law. Law propagates more law. This is why we have legalese. This is why we can't have nice things!
That's a weird thing to be alarmed by, because that description of what makes a good lawyer (maybe "effective lawyer" is the better term) rings extremely true.
In every formal legal dispute I've been in, the lawyers on both sides seemed essentially to function like professional negotiators. The fine points of the law itself had little to do with the actual work being done in resolving the dispute.
>>> What a cluster.. see kay. ... Law propagates more law.
Law is a kind of emergent system. As Justice Holmes famously said, "The life of the law has not been logic: it has been experience." [1]
>>> That is why we have legalese.
The term legalese covers a lot of ground. We need to distinguish between the different genera (TIL that's the plural of genus).
Some legalese is simply a compact system of shorthand. Such systems evolve in every field; for example, the shorthand terms edge case and corner case in software. Busy professionals expect their colleagues to know the shorthand.
Some legalese, especially the archaic kind, can be used in an attempt to mystify and awe nonlawyers, especially less-educated ones. Good lawyers try to avoid doing that.
(Footnote: In bar-association circles there's been a persistent move toward plain language; try a Google search for "plain language vs. legalese.")
>>> This is why we can't have nice things!
Law evolved as an alternative to the rule of the strongest. On the whole, law is sustainable and scalable, more or less. We imperfect humans actually follow it, mostly, notwithstanding our tendency to pursue our individual desires.
If you think you know of a better system, please feel free to run for office and try to persuade the rest of us to implement it. And remember the brute fact that we humans are loss-averse [2]; that's one big reason why persuading people to go along with political reform can be so difficult.
(You could always try the Daesh approach [3], but that's likely to get you locked up or killed, and it'd be like trying to rewrite your code from scratch [4] --- you'd lose so much codified knowledge of workable ways of handling edge cases and corner cases.)
By your logic, if I ace the ground school exam, then I'm ready to be a pilot.
It's going to be long time before a computer can write as well as an average lawyer. And, when that day comes, you'll find that just as many programming jobs have been automated away as law jobs.
Law school exams are about recognizing and discussing ambiguity, not about memorizing rules. The people who think it's about memorizing get Bs, the ones who realize it's about ambiguity (while also decently memorizing the important details) get As.