
I'm considering getting a masters or PhD (in PL) under a professor I work with now for my undergrad thesis. It has been my observation that the standard path of getting a standard corporate job tends to nullify all impact you could have (with a few rare exceptions). And after that I could get a job, become a professor, turn my research into a startup etc. The pros are

1) I know my professor and he's a solid guy

2) Pays decently well, money isn't too much of a concern

3) I get paid to do research, university provides generous grants if turned into a startup

Cons

1) Hear a lot of bad things about the academic rat race, pressure to publish even at masters/PhD level

2) I could probably hack out some papers into journals, but whether I could have any real impact "on demand" (versus say spontaneously coming up with something) is a big question mark, especially within the deadlines given in the program

Any thoughts on this? Especially heuristics, methods or ways to increase impact?




Understand that a PhD is an apprenticeship to become a researcher. You are not expected to do career defining work as a PhD student, and indeed that is unlikely.

Your relationship with your advisor is very important. It seems like you already have that sorted out.

Most successful PhDs (in CS) involve tackling a relatively small and easy project, usually suggested by your advisor, early on, and then expanding and iterating on this. Once you make some progress on a topic you'll easily find more directions to take it.

Working with other people is one of the easiest ways to increase your productivity. All the great groups I saw had a lot of collaboration. Don't fall into the "lone scholar locked in the library" stereotype.

Avoid bad people. Avoid getting stuck in your own head. Realize a PhD is a project like many others. It doesn't define you. You start it, you work consistently on it, you finish it.

Doing a research Masters is usually a waste of time. Doing a taught Masters is a lot of fun, but something quite different to a PhD.


Thanks for the reply!

>A PhD is an apprenticeship to become a researcher

That's a good way to look at it. I suspect one of the biggest possible benefits of a PhD is that you're put in an environment structured around, and pressuring you toward, developing something new, which is the opposite of most other human work.

>Start a relatively small and easy project and collaborate

Sound advice, it's the general approach I've taken for my undergrad thesis.

>A research masters is a waste of time, a coursework masters isn't

Really? It looks the opposite to me. A research masters lets you collaborate with different people and work on new things. A coursework masters is taking advanced classes.


At least in the UK, a PhD takes one year more than an MRes and it lets you become a university lecturer. It also should be funded, whilst you might have to pay your own way for an MRes. Hence I don't see the point of doing an MRes when you can stay an extra year and have more opportunities afterwards. An MRes is usually a consolation prize for people who drop out of their PhD, IME.


In my side of the world I think it's more similar to the USA, where a masters is two years and a PhD is four. And it's fifty-fifty whether a PhD has a masters or comes straight from undergrad. I'm leaning towards a masters because I don't particularly care about the prestige and I don't want to over-commit. I don't think the title is important to the impact I have.


As a professor many years after the PhD, my advice is to do the PhD only if you are genuinely excited / cannot stop yourself from doing research. Only then will it outweigh the negatives: the difficulty of getting jobs, somewhat low pay, etc. At least from my point of view, I always tried to work on what was interesting to me and what I was good at (or was interesting to learn), vs optimising for what is more high-profile/sexy. I don't think it is universal advice, but at least I always enjoyed what I did.


I can only second this after having advised a few students from bachelor to PhD level. The ones who do well are (usually) the ones who are genuinely excited. Not only about the thing they're doing, but in general. It really helps getting over the lows.

Furthermore, do not underestimate the importance of sheer luck. Exaggerating a bit, deep learning was just another subfield of ML, until GPU-powered DL really took off and made the researchers behind the most fundamental ideas superstars. This is not a given, and it might take years or decades until it's really clear whether you're making an impact or not.

I wish you the best of luck, InkCanon, and stay excited!


Thanks!


What do you mean by cannot stop doing research? I certainly haven't discovered anything new, but I love learning new things, reading about ideas, coming up with them.


I meant that you tend to spend free time on that, as opposed to treating it like a 9-to-5 job. And again it is important that if you do that, it is because you just want to see what comes out of your experiment/learn a new thing etc, rather than because you have to publish or are forced by your advisor.


I see. I think I definitely lean towards that.


You'll do great. This will eventually turn into new discoveries if you keep at it.


That is the hope!


Why not just study a bunch of different things to Master's level then? Learning something genuinely new seems like it has a much lower return to effort.


Good question. IMO

1) There's a kind of "hard" learning, where you're learning in a fixed, structured way from a textbook.

2) There's a kind of "soft" learning, the transmission of knowledge, which happens a lot more face to face when you're working together.

3) Then there's a kind of research learning, where you're doing something new, usually with collaborators.

The second and third are really best done in certain environments, like research or good companies.


>getting a standard corporate job tends to nullify all impact you could have

It's very strange to me that you think other people would pay you millions or tens of millions of dollars over an average 30- or 40-year career, without you generating at least that amount of value back to the external world as a whole, and probably generating some huge multiple more, and yet all that counts as "no impact" to you. Especially when your comparison point isn't oncology or something, but doing research in PL theory of all things.

But I thank you for giving me the opportunity to get a little riled up on a lazy Sunday morning, it's one of my favorite hobbies. My recommendation to you for "increasing [overall] impact" is to read https://80000hours.org/ and follow their advice, and for "increasing impact [in this niche I really care about]" is simply to be more bounded with your claims.


>people would pay millions over a career without you generating at least that value back

Some of it is empirical observation. I've seen many friends at big/elite tech firms get paid to do very little. There are many claims online to that effect, although I weigh those less. And I think it's completely plausible. I think because of the exponential advancement of technology, huge accrual of capital and inability of human incentive structures to keep up, value does not universally equal money. IMO there are many examples. Many people at tech firms do things that are very loosely related to revenue generation - so you can almost double your headcount during COVID, fire tons of them and still function the same (a substantial amount of hiring and firing was tech companies FOMOing about each other). Meta's VR division has burned through $50 billion, but its people got paid incredible salaries. One in three Nvidia employees are now worth over $20 million. Many of them were working decent jobs making GPUs for video games and suddenly because of AI, their net worth went up 100x. Oncology is another possible example. By far the wealthiest people today are all in computers, instead of curing cancer.

I'm not saying these people are bad or anything like that. The other part of the equation, wealth as a signal, has become incredibly noisy. In some areas it is still a strong signal, typically smaller companies and startups where providing value is a lot more closely related to what you make. And conversely, I don't agree with money generated being a signal of impact in itself.


>I've seen many friends at big/elite tech firms get paid to do very little.

What matters is the outcome, not the amount of effort one puts in. If you're working at e.g. Google for $200,000 a year, your changes can affect millions to billions of people. At that scale even a small improvement like making Google Sheets load 1% faster can equate to millions of dollars of additional revenue downstream -- and likely tens of millions of dollars of actual value, since the largest tech companies actually capture only a low percentage of the value they create for their consumers.

You've just justified that $200k several times over for what might amount to two or three days' worth of effort for you, that's true. That's not a bug - that's a feature of working in a successful, scalable business. If you're inclined to do more than this "bare minimum" which you observe so many doing, just imagine how much value you could create for others if you actually worked close to your capacity in such a place.

>[B]ecause of the exponential advancement of technology, huge accrual of capital and inability of human incentive structures to keep up, value does not universally equal money.

I don't understand the thread of logic here. Claiming that human incentive structures are "unable to keep up" with value creation suggests to me that money is, if anything, a heavily lagging indicator of the real value one is generating, which is in line with the point above. But I don't think that is the point you are trying to make.

>Meta's VR division has burned through $50 billion, but its people got paid incredible salaries.

Most company actions are bets that the company's leadership think are net positive. Sometimes those bets don't pan out the way we expect them to - that's normal. Your own research might take longer than you expect it to, but that in itself isn't a reason to look back and say you made a bad bet.

As for the people, yes, you generally have to pay a lot to get top talent, and even that doesn't assure you of success. That's probably 2-4 years, out of a 30- or 40-year career, where their contributions may have been net negative to the bottom line. Maybe. If we include caveats like "Meta VR never becomes profitable in the future, either" and "none of the innovations from Meta VR turn out to be profit-generating via a different, unexpected mechanism". This probably equalizes out over the course of a career for the vast majority of these engineers. Not exactly a ship sinker.

>One in three Nvidia employees are now worth over $20 million. Many of them were working decent jobs making GPUs for video games and suddenly because of AI, their net worth went up 100x.

AI is hugely, hugely useful for all kinds of people. I use it every day both professionally and personally. Almost everyone I know does the same. If you truly derive no value at all from it, you are decidedly in the minority.

Is the claim here that they shouldn't have made money off of helping to manufacture the hardware that enables this invention which so many have found so enormously useful? Or maybe it's that since they never intended for their hardware to be useful for such a thing, their involvement should be worth less. That sounds way more like trying to invent a human incentive structure that can't keep up with the exponential advancement of technology than what we actually have. The current incentive structure, however, is wonderfully open to serendipity like this.

>The other part of the equation, wealth as a signal, has become incredibly noisy.

You've just given two examples where one company's wealth fell up to $50b because they made a bet on something that (for now) nobody wants, and another company's wealth went so high that a plurality of their employees are now millionaires because they made something everyone wants. That doesn't sound like a low signal-to-noise ratio to me.


>What matters is the outcome, not the effort

>At certain companies the scale could be enormous

The latter is true and I think the most legitimate reason for working at big companies. I should specify, on the first point, that they also accomplish little and affect very little. Things like internal tools that went nowhere, running basic ETL scripts, things like updating financial trade systems to comply with some new regulation. And this at a pretty slow pace.

My meaning about Nvidia and Meta VR is how people who didn't create value got enormously wealthy anyway. In Nvidia's case, the traditional GPU teams (which I suspect received most of the benefit, because they've vested the longest and made up most of Nvidia's pre-AI-boom headcount) got hugely rewarded by data center GPUs, which they played little role in. Conversely, Meta's VR team still got paid really well (their stock is even up because of AI hype, despite VR losses) despite their failure. So you have these systems where even if you fail or don't play any role in success, you're still paid enormously well. This is because companies capture the value, then distribute it in their very imperfect ways.

You're right that the valid reason for this is that tech companies act as risk-absorbing entities by paying people to take high-risk bets. But the necessary conditions for this are

1) Hiring really good people (not just intelligent, but really motivated, curious, visionary etc)

2) A culture which promotes that

The on-the-ground reality of 1) is that it's a huge mess. The system has turned into a giant game. There are entire paid courses advertised to get a job in MAANG. The average entrant to MAANG spends six to eight months somersaulting through leetcode questions, making LinkedIn/Twitter/YouTube clones, doing interview prep, etc etc. Many causes for this, including the bureaucratization of tech companies, massive supply of fresh grads, global economic disparities, etc. It's no longer the days when nerds, hackers and other thinkers drifted together.

2) Because of FOMO, AI hype and frankly a general lack of vision from many tech CEOs, it's just a mess. Anything AI is thrown piles of money at (hence the slew of ridiculous AI products). Everything else is under heavy pressure to AI-ify. I've heard from people inside that Google has really ended that kind of research culture that produced all the great inventions. There are still great people and teams, but they're increasingly isolated.


> Hear a lot of bad things about the academic rat race, pressure to publish even at masters/PhD level

Strongly depends on the advisor and your goals. If you want to stay in academia, some amount of publications is required. Your advisor, especially if he pays your salary, may also push you to publish. If neither is an issue, I guess you can even finish without publications.

> I could probably hack out some papers into journals, but whether I could have any real impact "on demand" (versus say spontaneously coming up with something) is a big question mark, especially within the deadlines given in the program

Nobody comes up with good ideas on demand. As you progress in your academic career, the rate of ideas (theoretically) grows. That's why you need the advisor: he can come up with ideas at a rate sufficient for his students.


>advisor might push to publish

That's fair. I'm just cautiously eyeing the likelihood of coming up with something publishable that's not a going-through-the-motions kind of thing.


> The main reason being getting a standard corporate job tends to nullify all impact you could have (with a few rare exceptions).

"Impact" is an ambiguous term, so it's quite vague what you mean. I assume "positive impact on the world and knowledge".

While this mantra is indeed motivational, it can set you up for disappointment, both in corporate as well as research/PhD settings, at the moment you realize how many hurdles there can be (toxic colleagues, bureaucracy, ignorance, etc.).

Also, for this interpretation of "impact", a corporate job can be very impactful as well.


>impact is ambiguous

This is the core of the issue (most replies usually involve some slightly different definitions). I take many definitions of impact, including societal use, contributing to knowledge, etc. But it's much clearer there are many things people do that are low impact, especially in places with a lot of bureaucracy, politics etc.

A corporate job can, but it seems to me as a result of various incentives corporate jobs tend to be compartmentalized, low impact and repetitive. We're also at a down cycle where tech, the historical haven for impact in a job, is scaling back a ton of things to focus on stock prices. If you know of any corporate jobs that do have impact by some definition of it, I'd love to hear it. In my experience these have been mostly startups.


Ask yourself this, has there been any useful Programming Language that has come out of PL research/ Academia in the last 20 years? The only example I can think of is Julia, and it only seems to be used by other academics.

If you’re looking to be impactful, you are much better off joining a job and working in your free time than doing a PhD. A PhD is a program to compete for academic prestige. Grad students want to publish papers that get them noticed at conferences, invited to talks at prestigious universities etc; those are the incentives, and always have been in academia. The brightest minds join academia because they care more about prestige than money (as they should, anyone can earn money, few can win a Nobel prize). In a healthy academic system, prestige is linked to real-world societal impact. That is still somewhat true in fields like Machine Learning; in some fields it seems to be completely misaligned with any real-world impact whatsoever (which seems to be the case for PL research). Our academic system unfortunately is a rotten carcass.

You could still, advisor willing, do research that interests you and not care at all whether you get noticed by conferences/journals, your peers etc. But that takes a certain level of anti-social behavior that very few humans possess, and so I say join a job. Plenty of companies, like Google, Apple etc, are still building programming languages which are used by engineers worldwide, and if you finagle your way into a job on those teams, you will have a meaningful, impactful job, which is also well paying as a side bonus.


> has there been any useful Programming Language that has come out of PL research/ Academia in the last 20 years?

The goal of PL research is not, usually, to produce languages that see commercial adoption but to produce ideas that industry adopts. You cannot say a language like Rust is not influenced by PL research.


No, I can very strongly claim that I doubt any of the modern languages like Rust, Go etc have been influenced by the trainwreck that is programming language research.

PL research today is actually the study of something called “type theory,” whose relation to the act of building programming languages is the same relation a math PhD has to a carpenter. You will be a great mathematician if you do PL research but I would prefer if you do it in the maths department and not con us into believing it has something to do with programming languages. This is apparently what undergrads are taught in a compilers course: https://x.com/deedydas/status/1846366712616403366 I rest my case. (imagine the grad course syllabus)

On the fringes, you might find research groups who are doing interesting useful stuff in programming languages, but that is an exception to the rule. Which is probably why you never hear any of the new language developers ever cite programming language research.


There is much more to PL research than "type theory". Look for instance at the POPL 2024 program [1].

Also Rust has been influenced by type theory. Rust's first compiler was written in OCaml, and the influence of OCaml/Haskell (and many other languages [2]) is pretty clear.

The goal of PL research isn't to design programming languages, but academic research has a lot of influence on programming languages.

[1] https://popl24.sigplan.org/program/program-POPL-2024/ [2] https://news.ycombinator.com/item?id=34704772

Edit: regarding https://x.com/deedydas/status/1846366712616403366?mx=2 these are just the formal specs of a type checker. Nothing magic or very complicated there, it's just a notation. Anyone who can understand and implement a type checker should be able to understand this notation as well.
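To make that concrete, here's a rough sketch of my own (a toy checker for the simply typed lambda calculus in TypeScript, not the system in the linked slides) showing how a judgment like Γ ⊢ e : τ turns into an ordinary recursive function, with each inference rule becoming one case:

    // Toy AST and checker; the names and structure here are my own invention.
    type Type = { kind: "int" } | { kind: "arrow"; from: Type; to: Type };
    type Expr =
      | { kind: "lit"; value: number }
      | { kind: "var"; name: string }
      | { kind: "lam"; param: string; paramType: Type; body: Expr }
      | { kind: "app"; fn: Expr; arg: Expr };

    type Context = Map<string, Type>;  // this plays the role of Γ

    function typesEqual(a: Type, b: Type): boolean {
      if (a.kind === "int" && b.kind === "int") return true;
      if (a.kind === "arrow" && b.kind === "arrow")
        return typesEqual(a.from, b.from) && typesEqual(a.to, b.to);
      return false;
    }

    // check(ctx, e) returns the τ with ctx ⊢ e : τ, or throws a type error.
    function check(ctx: Context, e: Expr): Type {
      switch (e.kind) {
        case "lit":                               // Γ ⊢ n : int
          return { kind: "int" };
        case "var": {                             // x : τ ∈ Γ  means  Γ ⊢ x : τ
          const t = ctx.get(e.name);
          if (!t) throw new Error(`unbound variable ${e.name}`);
          return t;
        }
        case "lam": {                             // abstraction rule
          const extended = new Map(ctx).set(e.param, e.paramType);
          return { kind: "arrow", from: e.paramType, to: check(extended, e.body) };
        }
        case "app": {                             // application rule
          const fnType = check(ctx, e.fn);
          const argType = check(ctx, e.arg);
          if (fnType.kind !== "arrow" || !typesEqual(fnType.from, argType))
            throw new Error("type mismatch in application");
          return fnType.to;
        }
      }
    }

The point is just that the Greek-letter notation and a few dozen lines of code describe the same thing.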


The creator of Rust in his own words:

“ Introducing: Rust Rust is a language that mostly cribs from past languages. Nothing new. Unapologetic interest in the static, structured, concurrent, large-systems language niche, Not for scripting, prototyping, or casual hacking, Not for research or exploring a new type system, Concentrates on known ways of achieving: More safety, More concurrency, Less mess, Nothing new? Hardly anything. Maybe a keyword or two, Many older languages better than newer ones: e.g., Mesa (1977), BETA (1975), CLU (1974) … We keep forgetting already-learned lessons., Rust picks from 80s/early 90s languages: Nil (1981), Hermes (1990), Erlang (1987), Sather (1990), Newsqueak (1988), Alef (1995), Limbo (1996), Napier (1985, 1988).”

If modern PL research is trying to take credit for the latest hot programming language (which I doubt they are, it’s only internet commentators who have nothing to do with PL research who argue with me. Actual PL researchers don’t care about Rust), they should be embarrassed.

Thank you for linking the latest PL research, it has been a while since I’ve gone through it, glad to see nothing has changed. Ask yourself, how many of those talks in day 1 have accompanying code? Is it even 25%?

For giggles I decided to peruse “Normal bisimulations by Value”. A 54-page dense paper with theorems, equations and lemmas. Lol, what are we even doing here? You can also notice that they don’t bother justifying their research in the intro or the abstract by claiming relevance to any actual programming language. They themselves realize it’s just math, and “PL researcher” has become a euphemism for mathematician. Frankly, even one such paper being accepted to a PL conference tells me something is going awry in the field, but if a majority of papers are like this, then the field is a wasteland that only serves to grind young talented minds into spending their lives chasing academic prestige with no value to society.


> Ask yourself, how many of those talks in day 1 have accompanying code? Is it even 25%?

57 out of 93 papers (61%) published at POPL 24 have an artifact available. Note that this may also be automated proofs etc, it's not necessarily "running code".

But I also think focusing on POPL as a representation of the PL community isn't entirely fair. POPL is the primary conference focused on type systems within the PL community. It's a niche within a niche. Conferences like OOPSLA, ECOOP, or ICFP are much broader and much less likely to be so focused on mathematical proofs.

[1] https://dl.acm.org/toc/pacmpl/2024/8/POPL


I asked Claude to go through all paper names and estimate how many have code vs how many are proofs:

“Based on my analysis, I estimate:

- ~35-40 papers (roughly 35%) likely have significant accompanying code

- ~55-60 papers (roughly 65%) are primarily theoretical/mathematical proofs”

I suspect even the remaining 35% doesn’t have much to do with programming languages, and I don’t think these stats change much for other conferences.


> I don’t think these stats change much for other conferences.

I'd severely doubt that: there is a large difference in focus on theory vs practice between conferences. POPL really is one of the more theoretical conferences. At a conference like ECOOP, you're unlikely to see many proofs (I'd guess at most 20% of papers, based on personal experience).


I did the same thing for ECOOP 2024: https://2024.ecoop.org/track/ecoop-2024-papers#program

Claude estimates 10 papers related to programming languages and their features, and 27 related to theory, verification etc.


One sub-field of PL research is the ability to formally specify programs, and thus understand their meaning and prove their correctness. A great project that is based on lots of theoretical foundations and has practical implications is CompCert [1]. They wrote a C compiler and proved that it translates C code to equivalent assembly code. You couldn't even state the problem without all the maths, let alone prove it. I'd argue that having correct compilers is worth the effort.
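Roughly, and only as my own paraphrase of CompCert's semantic preservation theorem (the actual Coq statement is more refined), the property being proved looks like:

    \forall S\, T.\ \mathrm{compile}(S) = \mathrm{OK}(T) \;\wedge\; S\ \text{has no undefined behavior} \;\Longrightarrow\; \mathrm{Behaviors}(T) \subseteq \mathrm{Behaviors}(S)

i.e. every observable behavior of the generated assembly is an allowed behavior of the source program.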

I assume "Normal bisimulations by Value" talks about equivalence relations between concurrent programs. If you want to prove correctness properties of concurrent programs or cryptographic protocols, this is one of the tools. The fact that there's no code, only maths, doesn't make it irrelevant.

> Actual PL researchers don’t care about Rust

Not true, I just watched this video a few days ago about Rust semantics [2]. How would you prove that a Rust program making use of unsafe constructs is actually safe? What does safe even mean? How do you rigorously describe the behavior of the Rust type checker? AFAIU there's not even an informal spec, let alone a formal one. How are you supposed to write a correct program or compiler if the language isn't specified?

> Rust is a language that mostly cribs from past languages. Nothing new.

Doesn't mean that Rust isn't influenced by academic languages and ideas. Anybody who knows Haskell or OCaml sees the direct influence.

Research isn't industry. A lot of what is produced may have no direct applications but may in the future. This is the point, it's research. Also, just because you don't see the connections between research and application doesn't mean they don't exist. Lots of people working on these industrial tools have an academic background and bring their knowledge into the equation.

> If modern PL research is trying to take credit for the latest hot programming language (which I doubt they are, it’s only internet commentators who have nothing to do with PL research who argue with me. Actual PL researchers don’t care about Rust), they should be embarrassed.

You're the one explaining that Rust didn't benefit from academic research which is obviously not the case.

[1] https://compcert.org [2] https://www.youtube.com/watch?v=9y1dLDnS4uE


Have you ever talked to the people who design those languages? Because they will disagree with you about as strongly. And, of course, they are in a position to be correct.


1. TypeScript (and Dart, which influenced it) would not exist without the research on gradual and optional typing. Many other of the type system features in TypeScript – like type inferencing, intersection and union types, and type-level programming (e.g. conditional types) – find their origin in PL research, and were uncommon in mainstream but common in academic programming languages before TypeScript appeared (a small illustration follows after this list).

2. Similarly, mypy was created by Jukka Lehtosalo as part of his PhD [1] and part of a wave of research in applying gradual typing to dynamically typed programming languages.

3. Rust's ownership types and borrowing are based on PL research, such as linear logic / linear types. Same for traits. Early Rust even had typestates.

4. Several of the core developers of Rust, Go, TypeScript, C#, Dart, Scala, have a PhD in PL or a background in research.

5. Generics are another feature that was heavily researched in academia (admittedly a longer time ago) before becoming part of mainstream programming languages.
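To make point 1 above a bit more concrete, here's a tiny illustration of my own (ordinary TypeScript, nothing specific to the research cited) of union, intersection, and conditional types:

    // Union and intersection types:
    type Id = string | number;                    // union: either is accepted
    type Timestamped = { createdAt: Date };
    type Named = { name: string };
    type Entity = Timestamped & Named;            // intersection: both required

    // A conditional type: branching resolved at the type level by the checker.
    type ElementOf<T> = T extends (infer E)[] ? E : never;
    type A = ElementOf<string[]>;                 // A is string
    type B = ElementOf<number>;                   // B is never

    const user: Entity = { createdAt: new Date(), name: "Ada" };
    const id: Id = 42;

As noted above, these features were explored in academic languages and papers well before they showed up in mainstream compilers.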

So I completely disagree with you: most modern languages have been heavily influenced by programming language research. In fact, I'd be hard-pressed to find a modern PL that hasn't been in some way influenced by PL research.

(One thing I agree with in your comment is that current PL research focuses too heavily on type systems and should look more at other interesting PL features. My recommendation to InkCanon would therefore be to look broader than type systems. The problem with research on type systems is that, because it looks math-y, it feels more like "science" and hence "cures impostor syndrome". But cool stuff can be real science too!)

[1] https://mypy-lang.org/about.html


> Ask yourself this, has there been any useful Programming Language that has come out of PL research/ Academia in the last 20 years?

Scala


There are several assumptions tangled together here

1) Use is sufficient, but not necessary, for impact. The theory of relativity, a lot of QM, etc. have had uses only in real-world edge cases, but have enormous value. The value function for impact, so to speak, includes more than just use.

2) There is the structure of academia and its incentives, the average behavior of people in it, and its outcomes. I don't necessarily have to bow to its incentives, nor behave like the average person in there. Academia is also sufficiently large and fractal that you can find people less interested in the incentives and more in some thing they obsess about.

PL has had some interesting, although sometimes unheard of, real-world uses. CUDA for example. A significant chunk of PL now focuses on ML. A while back a company called Monoidics got acquired by Facebook for work on static bug finding with formal methods. Rust has been pretty influenced by PL concepts. New languages like WASM are formally specified from the ground up, and there are exciting opportunities for that.

I have considered slinging my resume to more research-oriented companies, but hearsay from people is that the golden age is over. Under FOMO and stock market pressure, these companies are eradicating the kind of freewheeling research they used to do and dumping money into ML and ML hardware. Not to mention it's a bit of a dice roll and a circus to get a job at such companies nowadays as a fresh graduate.


The only group under low pressure, free to do anything they want in Academia are tenured profs who have established themselves well or grad students who don’t care at all about remaining in Academia (and presumably have NSF fellowship or similar so that they don’t need to listen to their advisor either). Everyone else needs to grind, profs without tenure are arguably under the highest pressure, PhDs who want to stay in academia have high pressure etc. If your prof is tenured but not established and is struggling for grant money, even he is under high pressure to publish and win grants, something I learnt the hard way. The grant acceptance rate is what, like 20% these days.

My 2 cents: you will be more likely to encounter creative coders who are passionate about a field in the right industry team than in academia. Unfortunately getting into the right industry team is also a grind, and you likely won’t get there right out of undergrad, but within 10 years, if you put in effort and grind, you can get there. I think it’s better odds and more fun than going to academia, but your mileage may vary.


> A PhD is a program to compete for academic prestige

That's true for some people but others have different motivations, such as learning useful skills so they can gain the ability to work on interesting problems in a given field.

Doing a PhD in PL can also help you get the kind of jobs you mentioned, and achieve more once you're there. For me, the most valuable thing I got out of the process was extensive exposure to the literature, which has been useful in a range of contexts.


> And after that I could get a job, become a professor, turn my research into a startup etc.

chances of getting a professorship, tenure, or even a post-doc are close to nil, due to extreme competition and limited seats. academia is the most slowly moving enterprise, some folks in their 80s still around, while young grads get kicked out.

getting a job after PhD may also be very hard. you would be very over-qualified, likely huge ego, and very narrow skillset in your domain, that is likely lagging behind industry. managerial (or even just "work at corporate") skills will be lacking. unemployment of PhDs is wildly high, even higher than if you did not do PhD.

turning research into a startup may be much harder than you would expect. milking government funding for years (and surviving the jungle of academic politics to get its cut) is very different from the market outside of it (i.e. venture capital, startups, tech, etc.); by the time you want to make a startup you would have to learn it all from scratch, or even un-learn things, as many would be detrimental.

then there is toxic academic culture (funding, publishing, power dynamics). and in recent years academia has become a pit of wild woke left agenda, even more oil on the fire.

tbh, if you want to do something special, academia as we have it today is not the best place.

if you still want to do it, I guess the best strategy is to "do it quick and get out". some smart people I know are doing exactly that: doing an accelerated PhD asap and getting the hell out of academia. (but then, it all depends on your professor power dynamics. in some places they would not let you graduate unless they wish so.)


>getting a job in academia is really hard

In my side of the world it's a little bit better: the CS department has plans to double headcount in the next few years. They've got whole new faculty apartment buildings set up and everything, and the funding situation is quite generous (I'm told). Although I have also read that in the USA the bar for even stipend-paying masters/PhDs has gotten incredibly high.

>Milking government funding for years

There are special programs for startup-oriented funds, so it's more like VCs pitching to academics with equity-free grants (although naturally there's the whole university research IP issue). But I'm quite willing to put up with it to do something meaningful (at a decent number of jobs you put up with it just to keep your job). I do keep an ear on startup-y things, I don't think I'll have it any easier than an undergrad but I think I won't be too disadvantaged.

>Do it and get out

I don't place too much emphasis on the PhD per se but on the real value of it.

>If you want to do something special academia is not the place

Ten years ago tech would've been a good place. But now, especially for a new graduate, it's a bloodbath, not to mention there have been huge layoffs. Academia seems like the better option nowadays.


> chances of getting a professorship, tenure, or even a post-doc are close to nil

What an extreme exaggeration. Yes academia is competitive, yes tenure is hard to get (obviously). But the chance is not "close to nil" for that at all, and it's certainly not "close to nil" for a postdoc lol.

> getting a job after PhD may also be very hard. you would be very over-qualified, likely huge ego, and very narrow skillset in your domain, that is likely lagging behind industry. managerial (or even just "work at corporate") skills will be lacking. unemployment of PhDs is wildly high, even higher than if you did not do PhD.

This is just not true x) There are no numbers where PhDs have worse unemployment than grads.

Conversely, it does open a lot of doors for industry jobs (think ML, quant finance, to name a few)


Getting a postdoc position is usually easier than getting into a PhD program in the same university, especially in top universities. And the chances of getting a tenure-track faculty position or similar are probably something like 1/3.

The biggest obstacles to getting an academic job are personal. The jobs are wherever they are, and your (or your partner's) preferences cannot change that. If you are willing to relocate, your chances of getting a good academic job are much higher than if you restrict your search to a single city / region / country / continent.



