Hacker News
Ask HN: What skills do you think will be most in-demand in 10 years?
16 points by anbardoi 3 months ago | 50 comments
I’m preparing to go to college, and would like to hear some people’s thoughts on what skills would be valuable to have in ten years, from an IT perspective. I’m initially majoring in computer science, and have thought of doing quantum computing electives/research opportunities. What do you think will be the skills to have in 10 years?



College curriculums are barely teaching skills that are directly relevant in the industry today, let alone 10 years from now, and that is by design. The best approach to your career as a student is to get a solid knowledge of the basics. Coding, data structures, algorithms, statistics, combinatorics, proofs, compiler design, computer architecture, cryptography, operating systems, databases. And work on the non-technical areas as well – teamwork, communication, technical writing.

All of this will provide a foundation that you can build on for the rest of your career, no matter which direction the industry happens to go in.


From an IT perspective: social skills. All the soft stuff that's impossible to measure. I'd put some more points into charisma, because people aren't going away.

Gone are the days of being an eccentric genius who's good with computers. There's a less eccentric smart person who's just as good with computers, and they even take regular showers. Being able to code is table stakes in today's competitive world; you need to do far more to stand out. So the skills that last are all based on an understanding of human psychology. If I sell you something at $5, that's one thing. But how about I tell you it's normally $10 and I'm giving you a deal because you're special?

The technology stacks will wax and wane, and any specific guess is going to look hilarious in 10 years. TypeScript? CUDA? Linux? Who knows! What isn't going to change is change itself, so be prepared to still be learning new things in 20 years.


It's difficult to predict, but I don't see the classics going away: Java, JavaScript, SQL.

But overall, the goal should be to get a broad range of experience in different platforms so the technology you use is irrelevant. From that point on you just need to decide where you want to invest your time.

For example, when I was doing internships in college I invested in getting Java experience because I knew that it was used by large companies (companies who pay more). This paid off and I was hired by a company who paid me a lot of money to use Java straight out of college. Now I'm paid a lot of money by another company to use Java.

So you want to have the capability to code in anything, but you have to decide which technologies to put on your resume based on what's out there. Knowing Java, JS, and Web Application Development is never a bad idea.


SQL is a great one because I've found so many devs either know nothing or know the bare minimum, so it's easy to become the SQL guy on a team just by being familiar with window functions. There's a lot of SQL out there, too; nothing has dethroned it in what is now nearly 50 years, and I don't see anything coming close.

It's also one of the best bang-for-your-buck things to learn in terms of time-to-usefulness. Whether you're writing JS, Java, C#, Python, Ruby, etc. with a backing DB, SQL is probably part of the project. As for NoSQL, there's a lot of great stuff out there for specific situations, but the querying is either fairly simple (with the complexity pushed into application code or infrastructure), or the query languages are very specific and less applicable to other software.

The core of SQL is very versatile and it'll be there whether you're at a three person startup or Wells Fargo.
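On the window-function point above, here's a minimal sketch using SQLite from Python. The table, names, and numbers are made up purely for illustration, and window function support needs SQLite 3.25 or newer:

```python
import sqlite3

# Hypothetical table: rank employees by salary within each department
# using the RANK() window function (requires SQLite >= 3.25).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER);
    INSERT INTO employees VALUES
        ('Ann',  'eng',   120),
        ('Bob',  'eng',   100),
        ('Cara', 'sales',  90),
        ('Dan',  'sales',  95);
""")
rows = conn.execute("""
    SELECT name, dept,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank
    FROM employees
    ORDER BY dept, dept_rank
""").fetchall()
# the highest salary in each department gets rank 1
```

The same `RANK() OVER (PARTITION BY ...)` shape works across Postgres, SQL Server, Oracle, and friends, which is part of why the skill transfers so well.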


In addition, the industry has repeatedly tried to kill SQL, but it lives on. The staying power SQL has shown is matched only by C.


Can you give some examples of "repeatedly tried to kill SQL"?

For me, only key-value/NoSQL stores (which trade away some ACID guarantees in favor of others) and the various incarnations of ORMs (which try to normalize different SQL dialects and map them onto your familiar OOP language) come to mind.

However, they are either orthogonal or complementary to SQL.


Funny that for you Java and JavaScript are the classics.

After the initial big corporate SOAP/servlet buzz of the late 1990s, Java itself trended down for a while. What we have now (mostly thanks to borrowing a lot of good ideas from C# and other, even more "dynamic" languages) gave it a second life.

There was also strong PR and a push for it in academia (because there were many big corporate jobs for it). But without the improvements it got (and continues to get), it would've ended up like Visual Basic.

And I'll leave the JavaScript mini history lesson (or rant) to someone else.


I've been thinking about this for a while. I mostly do JS/TS (full stack), but having C# or Java under my belt would probably be a good idea; those two aren't going away anytime soon. Plus Docker and k8s.

Like you said, the classics aren't going anywhere


There will always be demand for people who can:

1. Solve problems.

2. Communicate complex issues in a way everybody can understand.

3. Discover the real problem that needs to be solved, instead of the perceived problem or a pre-determined solution.

The technical tools at your disposal are constantly evolving but the principles remain the same.


These are the skills you need to do any tech job. But whether we like it or not, it's the bullet list of technology skills that get you to and through the interview. So I think it's worth discussing what tools and languages might be useful to learn for the future.


If you want a clear path, avoid IT. The best advice is to learn to learn, and don't ever stop. In my over 40 years, the only constant was change. Whatever you learn at college, don't assume you will end up using it.


I have the opposite perspective from my 40 years. Not about learning to learn; that is 100% correct. But change is not as constant as it seems. The surface level churns and changes, while the fundamentals change very slowly. Hardware, networking, and the low layers of the OSI stack are not exactly the same as back then... but close enough. If you understood them then, you understand them now, even if some details have evolved. Likewise, the UX of OSes has changed, as have the details of the internals, but not the concepts of file systems and command lines.

Picking up new details is easier than evolving your understanding to a completely new paradigm, so if someone wants to learn something today and have it be relevant in 10 years, just work lower and lower in the OSI model.


While I'm only 40 y/o - I started with computers really early (Amstrad CPC 6128, later Amiga before PCs).

And both of you seem correct.

Looking at a short enough time window, and especially if you don't know those lower/deeper levels and concepts (even more so if you focus on buzzwords and trends), it does seem to change a lot.

With deeper and longer-term context, and a focus on concepts, IT seems to be going in circles: from mainframes and thin clients, to fat clients, to big servers, to "serverless".

Last time I checked, "serverless" had gone from its initial "So it's basically like CGI scripts: each request executes a program from scratch" to "And now it's faster with persistence" (so, like FastCGI/Plack/etc.).

Of course, with orders-of-magnitude improvements in memory size, execution and computation speed, and bandwidth/latency...

Every now and then, some things that were impractical or gimmicky, and maybe even impossible (e.g. text-to-speech, sound/image-to-text, 3D/VR/raytracing, ML/AI...), finally become possible.

And on occasion the sum of those "new but theoretically/conceptually old" things unlock something that's really kind of new - like video deepfakes.

To me personally, ChatGPT and similar still seem like a much less (but still a bit) gimmicky variant of those late-1990s IRC and early-2000s phpBB chat bots.


Honestly, the most important skill is learning how to learn. There are some basics that will always be useful, such as SQL or any of the popular high-level programming languages. But in 10 years, the demands of whatever sector you want to work in will have changed in ways we can only guess at. Learning how you learn, and which techniques help you stay focused, is much more useful in the long term.

If you are serious about going into research, seek out (abstract) mathematics. It's a language in itself and (in my experience) takes the longest to become comfortable with.


I'd doubly emphasize this. Technology changes rapidly regardless of bubbles; it's better to learn how to pick up skills quickly and grok (deeply understand) them.

I found these two courses to be really good foundations for kicking off my own re-education post-bachelor's (in CS). They filled in blanks and reinvigorated my internal "yes, I can do/learn that" growth mindset.

https://www.coursera.org/learn/learning-how-to-learn

https://www.coursera.org/learn/mathematical-thinking


Thanks for this resource! And the advice! Thanks parent comment for your advice too!


Communication and marketing. Nobody really knows what will be important on the technical side, especially since 10 years is a good timeframe for disruptive changes from AI. But being able to communicate well with other humans, with yourself, and maybe even with AIs, and being able to market yourself at least at a basic level, are skills that are very important in any business.


Sure, I follow you. But you need to actually have some hard skills in order for those to have any value, right?


Yes, but hard skills are not fixed. They change with company, project, country, industry sector, time, and opportunity, and they constantly evolve and swing around. And that's ignoring your personal ability to be good at a specific topic. That is why nobody can tell you what will be popular in 10 years or what will work out for you.

That is why you should have a solid foundation of general skills and be able to move fast in whatever direction opportunity arises. This is also where communication will help you, because it will teach you to listen, to understand people and their problems, to find opportunities, and maybe how to seize them.


Good old fashioned server wrangling, since the pendulum will be back in full swing from “cloud” back to “on-prem”.


Why do you think this will occur?


Tbh, I don't know. I don't even know which skills are most in-demand right now, because everything is being constantly and rapidly disrupted. Most likely, the most in-demand skills over the next 10 years will be the ones that can't easily be replicated by AI.


If LLMs are in an early phase now, in 10 years we'd be in the declining phase (with better tech replacing them?). Something akin to React today. So going 100% in on LLMs is a risky but potentially very profitable road to take.


LLMs run the risk of being captured by black box products that advertise themselves as being easy for non-tech people to use.

That doesn't make them a bad skill to have in one's toolbelt. Just keep in mind that companies love tech without techies, and LLMs could be one of those tools where usage is bimodally distributed: hardcore experts on one side building the tooling, and domain experts on the other side using it, which may not leave much middle ground for someone who knows a little bit of both.


I don't think math is going out of business


Seconded. Math does nothing for you by default but it makes almost everything else easier, to the point of making the impossible possible.


I have found several points in my career where taking the extra time to learn math fundamentals (particularly linear algebra) has paid off when learning something higher level. People often say the math in machine learning doesn't matter and that you can just staple Python libraries together, but I've made convincing arguments in product design meetings based on how the mathematics of certain algorithms lends itself to our particular use case and helps scalability.


Thirded. I just got back from a customer visit. They were trying to do something where the back-of-the-napkin math said it would take a couple dozen CPU-years to solve. I recognized a simple (in the math world) transformation we could do, and they could run the whole problem in under 10 minutes.

Coding the transform isn’t trivial, so we’ll get some contract dollars to solve it, but it will still be done faster than the naive approach.

These are pretty sophisticated customers, but they don’t have a deep math background. Without that you wouldn’t find the “obvious” solution.


If you can recall it, do you remember the math problem in question? And could you loosely explain how you transformed it? (Just general concepts or keywords, for my own curiosity. I love math, and seeing how different problems relate to each other in unexpected ways. For example, when the sum of two quadratic roots gives the width of a given rectangle. Like this problem: a rectangle has an area of 32 ft². Its width is 4 ft less than its length. What is the width? A = l·w and 32 = l(l − 4). I'll spare you the work shown, but l = 8 and l = −4, and (8) + (−4) = 4 = w.)
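The rectangle example above can be checked in a few lines (a throwaway sketch, not from the thread):

```python
import math

# 32 = l(l - 4)  ->  l^2 - 4l - 32 = 0
a, b, c = 1, -4, -32
disc = math.sqrt(b * b - 4 * a * c)   # sqrt(16 + 128) = 12
l1 = (-b + disc) / (2 * a)            # 8.0, the length
l2 = (-b - disc) / (2 * a)            # -4.0, the discarded negative root

# By Vieta's formulas the roots sum to -b/a = 4, which here is the width
width = l1 + l2
area = l1 * (l1 - 4)                  # sanity check: 8 * 4 = 32
```

The coincidence works because w = l − 4 and the roots sum to −b/a = 4; it wouldn't hold for an arbitrary rectangle problem.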


Roughly, it can be viewed as a change of coordinates followed by recognizing a symmetry. Say you had a 2D function of x and y, transformed it to r and theta, and then noticed it was independent of theta: you go from a 2D problem to a 1D one.

In this case we're transforming a 7D problem into a sum over a discrete set of 2D problems. With more algebra we could get it down to 1D, but that would take more human work than it would pay back in CPU time. If their project ends up scaling and the 1D transformation makes sense, we'll do it.
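The polar-coordinate reduction described above can be sketched numerically. This is a toy example with a made-up radially symmetric function, not the actual customer problem: f(x, y) = exp(−(x² + y²)) is independent of theta, so its 2D integral collapses to the 1D radial form 2π ∫ e^(−r²) r dr, and both converge to π.

```python
import math

def integral_2d(extent=5.0, n=400):
    """Brute-force midpoint Riemann sum over an n x n grid: O(n^2) work."""
    h = 2 * extent / n
    total = 0.0
    for i in range(n):
        x = -extent + (i + 0.5) * h
        for j in range(n):
            y = -extent + (j + 0.5) * h
            total += math.exp(-(x * x + y * y)) * h * h
    return total

def integral_1d(extent=5.0, n=400):
    """Same quantity via the 1D radial form: only O(n) work."""
    h = extent / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * h
        total += math.exp(-r * r) * r * h
    return 2 * math.pi * total
```

The payoff is the n² → n drop in work; the same idea scaled from 7D down to 2D is where "CPU-years to minutes" comes from.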


What field of mathematics is this?


Hard semiconductor R&D


foraging, skinning, herbalism, small engine repair…


Those ...?

Would those be stuff like carpentry/construction, animal husbandry, distilling alcohol, making "green" petrol/diesel (from vegetables), cooking gas (from food scraps and manure) ...

And anything and everything else that might be considered long term and post fall of civilization prepping?

At that point you might as well also add knowledge of some Germanic (that includes English), Slavic, and CJK/Asian languages... though French, Spanish, Portuguese, and such could also come in handy.


Electronics, machining and manufacturing

Knowing how to actually apply computers directly to solve problems quickly is always quite valuable outside Silicon Valley.


Social engineering, sales, marketing, being Sam Altman, history, maths and writing real good.


Cyber security isn't going away anytime soon.


Which should include at minimum

  How to test backups 
  Unidirectional networks, and how to use them
  Microkernels, multilevel security and capabilities 
  Along with the popular answers
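On the "how to test backups" line: the only real test is a restore. A minimal sketch, with purely illustrative paths and filenames:

```shell
#!/bin/sh
# Back up a directory, restore it somewhere else, verify contents match.
set -eu

src=$(mktemp -d)       # stand-in for the data you care about
restore=$(mktemp -d)   # scratch area for the restore test
echo "important data" > "$src/file.txt"

backup=$(mktemp)
tar -czf "$backup" -C "$src" .       # take the backup
tar -xzf "$backup" -C "$restore"     # restore it elsewhere

# a backup you have never restored is not a backup yet
cmp -s "$src/file.txt" "$restore/file.txt" && echo "backup verified"
```

In practice you'd checksum every file and run the restore on a separate machine, but the shape of the exercise is the same.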


The best skill is understanding your limits.

Just because quantum research is in demand doesn't mean you will be in demand or successful.

Take Elon musk for example.

He knew a Stanford PhD is highly valued. But it took him only two days to realize that he didn't have an IQ to match his classmates' and wasn't smart enough to be a leading researcher. He realized it early enough.

Had he decided to stay at Stanford, he would probably have dropped out after 2-3 years, demoralized and with imposter syndrome.

So the best skill I can think of is always knowing your limits.


foraging, herbalism, small engine repair… (whoops accidentally double posted)


You forgot beer brewing.


When do we start banning these questions?

It is literally asked and answered at least every week.

Not using search before asking shows disrespect to the community, am I wrong?


Moderation is a safe job, if that's what you mean; one that's not going away any time soon.


And good trolling skills never go out of fashion. And Python sucks; I would always prefer Java/JavaScript over that slow crap.


Well, any given week the answer might be different; the world is not a constant.


Every small dead or dying forum I've ever been in pushes new contributors away with some version of "did you even try to use the search function???".

https://xkcd.com/1053/


This doesn't seem right to me. For example, Stack Overflow has far fewer instances of such gatekeeping than it used to, and also seems less relevant than it used to. The gatekeeping occurred before the pedants understood how broadly and differently the community was being used, and it didn't impact its growth.


Asking the same question is not a contribution.

Every dying community I've ever seen was full of low-quality, repetitive questions.


Every dying online community I've ever seen was full of posts asking for a ban on another type of post.


That's fair; low-quality rebuttal on my part, sorry about that.



