Hacker News
Thoughts on tech employment (ma.tt)
167 points by luu 9 months ago | 244 comments



There are some obvious things missing from here.

I believe the prevailing wisdom is that tech hiring slowed because interest rates rose.

Because software scales so well, it benefits from speculative effort more than other business types. We see this in venture capital, where they only need 1 out of 100 bets to hit in order to make their money. Large tech companies do something similar internally. They may fund the development of 100 products or features, knowing they only need one of them to hit big in order to fund the company going forward.

When money was essentially free to borrow, it made all the sense in the world to make a large number of bets because the odds were on your side that at least one of them would pay off. Now, however, each bet comes with a real opportunity cost, so companies are making fewer speculative bets and thus need fewer people.
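
A tiny back-of-the-envelope sketch of that logic (all numbers invented for illustration):

    # Is one more speculative bet worth funding?
    # Hypothetical bet: costs $1M, 1% chance of paying out $103M.
    cost = 1.00          # $M per bet
    p_hit = 0.01
    payoff = 103.0       # $M if it hits
    ev = p_hit * payoff  # expected value: $1.03M

    for rate in (0.00, 0.05):       # near-zero vs post-2022 rates
        hurdle = cost * (1 + rate)  # what the same $1M earns risk-free
        verdict = "fund it" if ev > hurdle else "skip it"
        print(f"rate {rate:.0%}: EV ${ev:.2f}M vs ${hurdle:.2f}M risk-free -> {verdict}")

The same bet that clears a 0% hurdle fails a 5% one, so the marginal bets are the first to go.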

---

The other thing he doesn't talk about is the rise of remote work and the downward pressure that it puts on wages. I know that many companies are forcing employees to return to the office, but I'd speculate that the number of remote workers has still risen significantly. And that opens up the labor market considerably.

I'll tell you that I'm getting overseas talent for roles where 10 years ago I would have hired entry level talent in the US. But since my company is fully remote and distributed, the downside to hiring in LatAm and Eastern Europe has been significantly reduced.


>I'll tell you that I'm getting overseas talent for roles where 10 years ago I would have hired entry level talent in the US

the US is going to once again cripple one of its critical industries for short-term gains. What happens when your senior engineers retire and there are no replacements prepared, because it's more efficient to hire foreign senior engineers than to onboard and train a college grad? Even before AI improvements and Covid increased remote work, the bar for hiring entry-level engineers was absurd, and most startups only wanted seniors

I can't blame companies for doing this on the micro level because it makes sense, but at the macro level the US government needs to take action and treat hiring remote workers similarly to bringing in a visa worker, or add a form of tariff like you would for manufactured products. Government's job is to rein in market forces that are net bad for the country due to unaccounted externalities


Wow, you are right that we are crippling an industry for short-term gains, but it's the other way around.

We should let them come work here. Each engineer carries a wealth of knowledge, enthusiasm, and demand for other services that is very valuable for the country.

We are all better off if we let every engineer who wants to come and work in the States do so.

Imposing tariffs on remote work would just incentivize outward migration of companies.

Don't make companies choose either the US or the world, instead make the US the obvious choice.


Have you worked closely with a remote workforce from another country? I know a couple of dozen developers in Argentina who trivially rival any American counterpart, and they wouldn't move here if we paid them to. Not everyone dreams of migrating to the USA.


I'm sure there are lots of people who wouldn't move to the States. But it's completely untrue to think that more people wouldn't move if given a path, or even if given an easier path.


> what happens when your senior engineers retire and there are no replacements prepared because it's more efficient to hire foreign senior engineers rather than onboarding and training a college grad

This assumes that the overseas engineers aren't senior or reliable? I (in the US) work with a lot of talented and dedicated overseas folks who keep me on my toes. Some of them are founding or staff-level engineers at our SF-based startup.


Most bad lived experiences with offshore talent come down to labor costs. There are great offshore engineers, but many work for companies that aren't hiring at the top end of the local market: they're hiring offshore to save money. You get what you pay for. And that leads to impressions, even if incorrect ones.


"You get what you pay for" is fair, but its also worth pointing out that in some places "money goes further".

In my city, I can go out and eat at a steakhouse, 3 courses, with wine, 2 people, and the total bill is $30-$40 (total, not each). Nice sit-down restaurant, good food, linen napkins.

Consequently, highly skilled senior engineers can be paid < $100k and still live like kings. If the exact same person lived in the US, or worse, in an expensive part of the US, you'd pay more, probably 5 times more.

Once you embrace remote work (WFH) you quickly discover this very real geographical swing in value-of-money.

Of course -most- remote workers are crap. Most local workers are crap too. The remote-hiring problem is as hard as the local-hiring problem, probably harder. But the cost savings are immense, and the long-term PR is significant. ("Yeah, we're laying off 10% of support, but they're all foreigners" - kinda skips over the point that they were -all- foreigners to begin with.)

You get what you pay for, but that bag of silver you have turns into a bag of gold elsewhere.


> the US is going to once again cripple one of their critical industries for short term gains, what happens when your senior engineers retire and there are no replacements prepared because it's more efficient to hire foreign senior engineers rather than onboarding and training a college grad? Even before AI improvements and Covid increasing remote work, the bar for hiring entry engineers was absurd and most startups only wanted seniors

Presumably you hire more senior engineers from overseas? What, do you think American senior engineers are inherently better somehow?


And then they leave and form better companies, because they understand how things work, and then US politicians are sad because their critical industries are gone.


>And then they leave and form better companies

As opposed to US employees - who are well known as among the most loyal employees in the world?


This is a bad-faith response.


Or just let/require them to immigrate here, and get the best of both worlds.


The best of both worlds is living in Western Europe on a US-level salary


All of the problems people commonly complain about with regard to America vs Europe generally disappear when you're rich enough.


The traffic doesn't.


You get taxed ridiculously but can still afford private healthcare and air conditioning!


The tax really isn't all that different vs. California. They just don't spend as much money on the military, so go figure, they can give their residents healthcare.


Damn bro, sounds like Europe has money to spare then. Guess the U.S. should cut all spending in the area so Europeans can step up for themselves. I'd love to see the EU take the bill for policing their own backyard or trade routes.


Just do it already:

    the U.S. has about 750 military bases across 80 countries in the world and has deployed 173,000 troops in 159 different countries 
https://www.aljazeera.com/news/2021/9/10/infographic-us-mili...

there's no need to pretend the U.S. is doing it for the good of others.


I'd love to. The EU is free-riding on U.S's protection. At this point I don't care if Moscow is rolling tanks through Berlin. I would much rather be on good terms with Russia than prop up the rapidly declining EU while they act smug about their healthcare.


This is a regrettably shortsighted view of Russia and the EU.


Why Western Europe and not, say, Eastern Europe or Latin America, where the cost of living is cheaper?


Because Western Europe is, generally, not as shitty as Eastern Europe or Latin America.

There is some tongue in cheek to that comment, but not much. You generally don't hear about waves of people wanting to emigrate from France or Austria to Poland or Latvia.


> Because Western Europe is, generally, not as shitty as Eastern Europe or Latin America.

> There is some tongue in cheek to that comment, but not much. You generally don't hear about waves of people wanting to emigrate from France or Austria to Poland or Latvia.

Latvian here.

Life is reasonably affordable, most of the services you'd expect are available in the capital or most cities. Internet is pretty good (I pay 15 EUR for 200 Mbps up and down, could get up to 1000 Mbps for twice that, though cheaper with some providers), food is okay, rent isn't too bad in most places, went through university (both Bachelor's and Master's degrees) with no debt, healthcare is pretty good (especially with insurance), salaries kind of suck though across the board. Public transportation is pretty good, both between cities and within the limits of a city, I really don't need a car.

People can be a bit colder, not really that American small talk, but are nice when you get to know them. Except for some kind of harmful cultural stuff (alcoholism, machismo, LOTS of xenophobia but perhaps due to the country's past, some homophobia/transphobia/etc., people being selfish sometimes), also can't really read news site comments here, most are very toxic, like you wouldn't believe.

The weather tends to be okay in the summer, but sucks pretty badly in the winter/autumn and sometimes spring. Cold, windy, lots of overcast clouds, overall a very depressing mood, sort of. Nature is pretty though, especially if you go on a boating trip down the rivers.

I reckon that other parts of Europe are probably better in many respects, but as with most places, there are positives and negatives.


as opposed to all the Polish and Latvian people lining up at the Austrian and French borders… :)


Given the Schengen area, I'm not sure those four countries make for a good example of anybody waiting at borders in either direction.


Not sure if your comment was meant to be sarcastic, but basically, yes, ever since the EU instituted freedom of labor movement, and Eastern countries joined the EU, there have been overall huge amounts of labor migration from Eastern to Western bloc countries.


If you'd gently direct your eyes to Ukraine, you'll realize why Eastern Europe isn't exactly prime location material for a lot of folks right now.

Meso/South America has a stability problem as well. Unless you pick Uruguay, I guess, but being surrounded by large countries with both problems and guns is never a fun exercise.


The quality of life in Eastern Europe (and some countries in the north of Europe) is great (even greater than in France and many other places), but the risk of war spreading among these European countries is a real issue :/


Oh yeah, QoL is pretty decent. The economy for most of Eastern Europe is great as well. It's a geopolitical question more than anything else right now. (Not that Western Europe is much better there, but the time scale for potential conflict is currently a bit longer)


Not many tech people want to immigrate to the USA. With a $100k annual salary one can live like a king in LATAM or Eastern Europe.

Overall, from my point of view, life in Europe is much better than in the US.


There is objective data to either support or refute this claim. Look at immigration flows from LATAM to the US and compare them with the reverse. Then do the same with Eastern Europe.


And then look at the millions of devs in LATAM who never went to the USA or emigrated to the EU.


Exactly. Look at the net flows in and out of each location!


I agree we seem intent on crippling ourselves, but the inverse move would likely be better: give tax credits for hiring US citizens on US soil that offset any savings companies would get by hiring outside the US.

If you run the numbers, you can adjust individual income tax upward marginally across brackets to account for shortfalls, and it will be far less complicated to implement and enforce.

This gives an incentive to keep the dollars domestic vs abroad.

You may even want to give some additional tax incentives for companies that close their foreign offices to re-open them in the US. This would be similar to the scheme that helped bring some manufacturing back from overseas.


> “treat hiring remote similarly to bringing in a visa worker or add a form of tariff like you would for manufactured products”

How would this work in practice? Visas and tariffs are enforced at the borders.

Where is the checkpoint for taxing foreign remote employees? Would Congress require Slack and GitHub to scan for evidence of work being performed by remote employees?

If you just make a law but have no enforcement mechanism, people will find workarounds because they only have to be justified internally.


>Where is the checkpoint for taxing foreign remote employees?

company payroll/tax reporting, I'd assume. If payments are going to a worker in a foreign country, throw an additional tax on them so there are no cost savings compared to hiring locally. That way the only reason to do it would be that you couldn't find the talent locally


The foreign employees are often employees of a foreign company, because it's more convenient that way. It's difficult to do business in a country if nobody around you understands the legal status of your company. That foreign company could be owned by the same holding company as the American company, or it could be technically independent. And if you impose tariffs on buying services from foreign companies, other countries will reciprocate, which will hit US tech companies hard.


That would be quite the tax. Developers in Europe and Asia make a LOT less than the US.


that's kind of the point, you want to eliminate companies just doing labor cost arbitrage rather than hiring globally because they actually need a unique talent they can't find in the US. If that's the case, they will happily pay the premium because that person still provides ROI even at US pay rates. That's supposed to be how H1B visas work as well


What's to stop Apple, for example, from setting up companies all over the world that employ their people and own their IP local to the employees? Then Apple US buys / licenses products for US distribution from those companies.

Massive companies have far more incentive to "engineer" their way around such laws than lawmakers have to find something without loopholes. Especially given how lobbying works in the US.


To "own" those companies Apple has to participate in trade agreements between both countries respective legal systems, and also know those agreements mean something.

If your foreign subsidiary is owned by citizens of the country it's in, then how exactly do "you" own it if your claim to ownership doesn't in fact depend on agreements between the respective governments?


They'll just open foreign subsidiaries to hire the devs, and the US company will pay them "license fees". If market forces are net bad for the country, the country must improve its game and become more competitive.


> The other thing he doesn't talk about is the rise of remote work and the downward pressure that it puts on wages

This doesn't get talked about nearly as much, and I feel the loudest voices favoring remote work have more to gain than just a reduced or eliminated commute. Rather, this puts people into job markets they never would have had a chance to be in. As companies' leases run out on their offices, they'll switch tune from demanding RTO to performing layoffs in higher-salary and higher-CoL areas to drive down wages further. The "outsourcing scare" of the early 2000s was overblown. It will be for real this time.


Lots of companies outside of tech need tech products. The best way to build the right things is for everyone, engineers included, to get to know the customers and their business. For one, that's really hard to do through a laptop screen. Probably more important, it is much easier for a person in country/culture X to familiarize themselves with a business and its customers in country/culture X than in Y.


100% agree. But I suspect a lot of CEOs and their shareholders don't.


It's already happening. Tech companies are laying off U.S. talent but expanding in low-cost places like Poland and Japan (surprisingly). I suspect part of this is naive Americans, but I think there's also a psyop from these low-CoL areas to normalize remote work. This WFH craze is going to accelerate America's decline as we enter peak abstract society and workers just become an email and a Slack name, easily replaced by one of a million applicants from around the globe. We need to start having honest conversations about how to handle our decline gracefully so that the drops in the standard of living won't be too brutal.


I will second this. I work for an American company that now routinely outsources to Eastern Europe. Wages are 1/3, most speak English fluently, and the time zone differences aren't really that big of an issue.

It's much different than my experiences outsourcing to Asia around a decade ago. Many pushed for remote work, and the consequence might just be that they are out of a job.


Yes. Step 1 of remote is “are you really better than someone in Indiana with an $800 mortgage payment?” Step 2 is “And are you really better than the European Phd at half that cost?”


My Indiana mortgage payment is $500, thank you very much.


Haha I had one around there too. Indiana life is good life.


Unless you need an abortion.


Step 3 is “European PHD, if you’re really so good why are you half the cost?”


And the answer being "the limited number of local companies paying high wages and the limited number of American companies willing to hire someone in Eastern Europe reducing my leverage."


Step 4 is “European PHD is not as good as we thought, we need to start over."


Because in Europe his local salary is a tenth of the US offer. Knowing that, you could say he's actually charging a five-times premium.


For a similar reason why a person pressing a button to pour coffee in Starbucks on Manhattan gets 10x more than a person pressing a button to pour coffee in Starbucks in São Paulo.


With so much wealth in America, do you really think life is going to be better for you when you're competing against everyone in the world? Sure, I bet the people in Indiana, or India, are happy, but your SF salary is going to be demolished, and your local community is going to death-spiral as the wealth evaporates.


Not really. Employers that know what they’re buying will spend the money.


I recently was told to discuss things with a team in Serbia. Knowing not enough about the world, I expected it to be painful. I was blown away that they spoke English more properly and more clearly than a good portion of Americans even. And it felt more natural, not like the Britishesque English you tend to hear from India.

If it weren't for the time zone, I'd have really questioned my near term job prospects.


Your first point hits it exactly.

Let's say that it's year X and you have two choices: either you bet that the future will be bleak, and cut cost and hunker down, or you bet that the future will be bright and make big investments.

If X is 2022, the first bet will be right, while the second will lead to bloat and excess. But if X is 2014, the second bet will be right and the company will gain market share, while the first bet might make you go out of business.

So big tech companies weren't so irrational to make those big bets to begin with; they simply overcorrected in the other direction afterwards.

Now, as an employee in one of these I still think this is shortsighted and will damage them long term. But the stock market keeps rewarding these companies handsomely, so what do I know?


The thing I am most excited about is the diaspora of talent from the big tech companies. I'm hoping the talent vacuum slows down and we can infuse all those extra minds into more startups, more innovation, more value growth on mid-sized companies.


I'm thinking optimistically about this as well; however, it's also true that someone who does well in a large corporate environment and only knows large corporate operating procedures could struggle to adapt to the needs of a startup in a timely manner. A Google or Meta staff-level engineer's skillset is, I'd say, pretty useless for a company with only a few people. Plus, those folks likely also have the personal networks and charisma to bounce to another high-paying position elsewhere that needs those skills and is willing to continue paying for them.

Though I am hearing of a few who, after struggling to find a comparable job with comparable benefits, ended up starting something while they update their skills. I guess the question is, do they have the network and reach to turn it into a business?! Having serious downtime to decompress is a major catalyst for any of us drained of our creativity. So, at the very least, once VC money starts flowing again, and that VC money is not just obsessed with chasing LLM + X ideas, then we might see this new wave of startups.


As a software engineer, the environment over the past 10 years has been horrible. It has been very difficult to find good projects to work on which were not bullshit, i.e. doomed from the start.

Finding that good 1% of companies has been like finding a needle in a haystack, especially as an outsider where you don't even know what the needle looks like.

Also, tech recruitment has been broken since the beginning. It has been based almost entirely on resume buzzwords (in the pre-selection phase) and one's ability to solve puzzles quickly within a constrained time frame (in the selection phase). Neither of these things is relevant to the job of software engineering; what's relevant is solving problems in an optimal way. HR at big tech could have focused on engineers' track records on past projects instead of ignoring them completely. Also, in later years, DEI quotas further distorted the already COMPLETELY BROKEN recruitment process. Like rubbing salt into a gaping, infected wound.

I'm looking forward to the high-interest environment. I hope there is a huge crash and that most tech companies go bankrupt so that the industry can finally make room for people who know what they're doing.

Just look at what's happening in society and tell me that tech has been successful. The only innovation we got is LLMs, but wait to see how they're going to be applied to industry... If we keep going down the current path, it's not going to be good.

The tech industry over the past 10 years has felt like a show, completely detached from fundamentals. I was hoping that I would be given the chance to compete on a level playing field. It hasn't happened since I started my career.

Every playing field in the tech sector has been rigged and distorted by private equity, national governments, foreign governments, intelligence agencies, big corporations, startup incubators, the media/social media, law firms... Absolute psyop from every side. Every well-connected corrupt midwit and dimwit benefited; everyone except skilled creators.


The idea that a crash would benefit talented people more than the bullshitters is not really all that likely, in my opinion... More likely that the chosen few get put on life support through government contracts or entrenched work, and innovation scales back in total.

I almost expect a higher degree of bullshit to be honest; the cost of a startup will be higher, so the "honest" capitalistic speculation will be reduced, but there will still be plenty of money for DEI and CCP projects.


I can see how that might be true in terms of jobs but I suspect a crash might still open up opportunities on the business side as a bootstrapped entrepreneur.

The problem I've been having is that it's been difficult to find anyone to try my SaaS product, because it appeared as though everyone was hooked on easy money in their big corporate jobs and nobody wanted to lift a finger to try anything even slightly risky. I couldn't even find non-tech marketing people to join my startup. I couldn't find other startups willing to do any partnership.

After tech layoffs, I noticed an improvement already; now people are responding to my messages, I'm seeing more small startups in my area and they're open to discussions about partnerships. Very big shift already. I think if big tech collapsed, people would just snap out of their trance and be forced to find more creative ways to earn money which involve some risk.

The problem we have now is that big corporations have made it possible for many people to earn good money while taking on very little risk. Such an environment makes everyone extremely risk-averse. They don't even want to take the risk of rubbing their employer the wrong way by doing something on the side.


I'm guessing my experience is very different; I work more in robotics than pure software, and this is an industry where many of the problems are investment-intensive...

I wouldn't call it a crash in this field, but there is definitely a trend away from ambitious attempts. It's not really a field where you can take your savings and make a go of it for a year or two; capital is required, and capital is more hesitant at the moment.

I am curious about your product. To me, SaaS startups aiming to replace an incumbent always had the problem that the risk to your client was immense compared to the value you offer. For example, even if your service is a hundredth the cost of the entrenched one, it might not be worth it, because it's a saving on a small line item weighed against what could be the entire turnover of a business.


I fully agree.

Low interest rates mean companies/capital-providers look to speculative places (like new software products) to hit their growth numbers.

Higher interest rates mean companies/capital-providers return to known/well-understood places to hit their growth numbers.

I interpret the reasons in the article as well downstream from these kinds of investment decisions.


I have a slightly different view. In times of high interest rates, you are rewarded for profitability, not for growth. When interest rates are low, your company needs to grow or it will be outgrown by its competitors and become a takeover target. With high interest rates, unprofitable companies (unprofitability being a side effect of a growth strategy) slowly bleed to death and become takeover targets themselves.


Yes, thank you for distinguishing between profitability and growth.


Hiring overseas erodes America's talent pool. Ever wonder why no one knows how to make anything here? Our country's talent is being gutted by the entire world. Our desire is to become professional consumers. What a joke.


This feels like an emotional take. There are lots of people in the USA who are great craftspeople, but our economy has moved more towards computer technology specialization, because that is where a lot of the money is. I'm sorry, but making widgets as a source of GDP is just a phase of most economies; it doesn't make sense to pay someone here $1/widget when you could do it overseas for 5c/widget.

I don’t disagree with you that the US seems to be spiraling on the consumption front, and a lot of this is due to an epidemic of advertising.


You can think whatever you like. Just look around. Observe what people around you know.

"it doesn’t make sense to pay someone here 1$/widget when you could do it overseas for 5c/widget"

Why doesn't it make sense? Because you want more money? What do you think will happen long term to our talent pool?

What I have personally seen, and I can only speak from my own perspective, is that more and more people offshore and outsource their work. Someone gets good enough at something, and then they outsource it, not domestically but out of the country. And now those effects are compounding in a major way. Our culture doesn't value craft, and we are continually devaluing it.


I’m perfectly happy taking a free walk in the woods, not being after more money. But apparently you don’t understand economics.


I wasted my time reading your reply. I should outsource my comment reading!


Anyone who has studied economics has to know our current state of knowledge on the subject is a complete farce.


> But since my company is fully remote and distributed, the downside to hiring in LatAm and Eastern Europe has been significantly reduced.

The WFH zealots keep downvoting any comments about this, but the offshoring of tech employment and massive layoffs were the predictable, and predicted, consequences of the massive push for WFH.

Either agglomeration effects exist and people work much better when physically collocated, or they don't exist and companies incorrectly believed their employees worked much better when collocated (to the point that most companies were willing to relocate someone from Minnesota to SF and pay them $100k more than they would otherwise need to). Either way, the net effect is the same.

By insisting on WFH you’ve given companies absolutely no reason to continue paying 5-10x what they could pay the same person if they were living in a low CoL area.

The irony, of course, is that the WFH zealots understood this. Heck, many of them said this was an advantage of WFH as it would bring more jobs to mid tier cities in the U.S. etc.

Of course, being zealots, they wouldn't take the argument to its logical conclusion, which was: why would a company ever move a job from SF to Sioux Falls and shave off 20% when it could move it to Buenos Aires and shave off 80-90% of the cost instead?

The icing on the cake, however, is that people outside the US are far more likely to be working from the office and wanting to work from the office. So not only are they cheaper for companies, they also get the real or perceived benefits of the agglomeration that had companies paying their employees so much more than they needed to in the first place.


> When money was essentially free to borrow, it made all the sense in the world to make a large number of bets because the odds were on your side that at least one of them would pay off. Now, however, each bet comes with a real opportunity cost, so companies are making fewer speculative bets and thus need fewer people.

I disagree with this. Interest rates were very low for a long time. Tech hiring and salaries only boomed because of COVID. Companies needed to adapt or they would not be allowed to function, and these companies needed new software tools to handle it. COVID also encouraged quite a few people to retire or move companies. Why work here if you could end up in the hospital for another month? That same mentality also contributed to job hopping. Companies over-hired just to deal with growing turnover. They said themselves that they over-hired.

Interest rates do not help, but before generative AI, the best new startups would do was put out an app for a traditional service or sell NFTs. There was no real innovation in those low-interest times.


This is false. The rockstar salaries in software started in the early 2010s with the rise of the worldwide software (SaaS) company; many CS jobs started paying more than doctor or EE jobs even though the barriers to entry were lower. This directly coincided with the savings bubble from the first boomers (age 65) finishing their peak savings years, 2010-2022. This ZIRP savings bubble isn't coming back!

I've got news for you folks. This gravy train is not coming back ...


I'd also say that there is less speculation, since guaranteed investments carry higher payouts tied to interest rates, and therefore the risk-vs-reward calculation swings the other way. If you invest a dollar into each of 100 ideas expecting one or two to hit and return double your investment, but interest is at 5%, it just doesn't make sense to gamble your investment.


This ^


>I'll tell you that I'm getting overseas talent for roles where 10 years ago I would have hired entry level talent in the US. But since my company is fully remote and distributed, the downside to hiring in LatAm and Eastern Europe has been significantly reduced.

I really don't get the argument that remote work means it's easier to outsource jobs or hire overseas talent. The fact that you can get overseas talent for a fraction of the cost of hiring domestic talent sounds to me like a vastly more compelling reason to hire overseas or outsource, and yet companies nonetheless hired a lot of US talent despite this fact. Microsoft and Google had the experience and know-how necessary to make a globally distributed workforce work, and yet the headcount at their overseas offices was paltry compared to the number of staff employed at home. Why is that?

How does a change in work culture tip the scales in any meaningful way? Especially compared to that?

Yeah, sure, you hear stories about companies hiring more overseas talent now that remote work has been normalized, but I have a hard time believing that things would be completely different and they'd be a largely US-based shop if it weren't for remote work being a widely accepted thing.

If you aren't convinced by now that this argument is bullshit, take a look at those companies that were fully remote and distributed waaaaay before it got popular, especially those that have maps of where all their employees are located. One thing you'll note is that, despite being fully remote, most of their workforce is domestic, and a relatively small portion of their workforce lives in countries that have gained notoriety as outsourcing destinations.

Automattic is a great example of what I'm talking about: https://automattic.com/about/

Just to clarify, I'm not dismissing overseas talent; I'm dismissing the idea of overseas talent being used as a bogeyman that might suppress wages and take our jobs. If it were that easy, it would have happened already.


> How does a change in work culture tip the scales in any meaningful way? Especially compared to that?

Two ways:

First, when everyone is remote, it puts everyone on an equal footing. When you have mixed onsite and remote resourcing, it's much easier for a divide to creep in.

Second, the technology to support true remote work really accelerated during the pandemic and feels practical in a way that it hadn't before.

Large companies in any industry are generally not trendsetters but laggards because they have so much invested in the systems they already have in place. Moving all of Google or Microsoft remote is a daunting task. Starting a new company fully remote and then growing to Google or Microsoft's scale is much more practical and I think something we'll see over the next 10 to 15 years.


>Large companies in any industry are generally not trendsetters but laggards because they have so much invested in the systems they already have in place. Moving all of Google or Microsoft remote is a daunting task. Starting a new company fully remote and then growing to Google or Microsoft's scale is much more practical and I think something we'll see over the next 10 to 15 years.

Yeah, but Google and Microsoft both already have offices in India. They have logistics and infrastructure that have been in place not just for years but for two decades. Nonetheless, the latest figure I could pull for Microsoft's total headcount in India was 18,000. Impressive, until you consider that they employ 238,000 employees overall.

I'm not saying it's curious that they haven't gone fully remote by now. What I'm saying is that it's curious that their talent pool isn't more evenly distributed across the globe despite the fact that they've had a 20 year head start.

Sure, large companies aren't trendsetters, but by this point outsourcing is pretty conservative. Just within tech alone it was something that was talked about in the very early 2000s.

Regarding the two points you bring up, I agree that they make hiring globally distributed talent easier. It's no longer inconvenient to make special accommodations for an employee in LatAm when you're making the same accommodations for everyone else.

Nonetheless, you still have companies that were fully distributed and fully remote well before it became trendy. They made the same special accommodations for everyone. The technology available at the time was already good enough to allow them to hire talent from anywhere across the globe. Even so, in many cases it's the exception rather than the rule that their employees live in a low-CoL country rather than a high one. Most of the employees tend to work either in the US or Europe. Take any prominent company that's been remote since the early 2010s, look at their about page, and you'll generally see this trend.


> Second, the technology to support true remote work really accelerated during the pandemic and feels practical in a way that it hadn't before.

I've been working remote for the better part of 14 years and I completely disagree.

Slack existed before the pandemic, Jira existed before the pandemic, video calls existed before Zoom.

What tech are you referring to?


Which countries in Eastern Europe and Latin America do you find most of your talent?


Not parent, but I’ve hired and worked with a lot of great folks from Argentina, Romania, and Ukraine.

Other countries from which I’ve hired fewer, but generally solid, folks are Spain, Poland, Columbia, and Costa Rica.

Usually as contractors. Full-time employment laws in other countries are very, very different from the US.


Other than cheaper rates/salaries, doesn't hiring overseas help with taxes in some way (vs hiring domestically)?


Hiring folks on contract was less expensive due to the lack of US domestic taxes (e.g. social security).

With Section 174 changes, though, that's not the case anymore, and I've spent a lot of time understanding that (because of the 15 year amortization). I won't get on my soapbox about Section 174 stuff and derail this thread. Check it out, though!
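
Roughly the year-one math, as I understand the rule (5-year domestic vs 15-year foreign straight-line amortization, starting mid-year; illustrative numbers, not tax advice):

    # Year-one deduction under post-2022 Section 174 amortization.
    # Assumes straight-line with a half-year convention in year one.
    def year_one_deduction(spend, years):
        return spend / years * 0.5  # only half a year's amortization

    spend = 100_000  # hypothetical contractor dev spend
    print(year_one_deduction(spend, 5))   # domestic: $10,000 deductible
    print(year_one_deduction(spend, 15))  # foreign: ~$3,333 deductible

The other ~$96,667 of that foreign spend stays in taxable income this year, which eats into the raw rate savings of hiring abroad.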

edit: missing content/context


https://blog.pragmaticengineer.com/section-174/ is a good read on Section 174, for those who might be curious.


Gotcha. Is that making you lean towards hiring domestically now instead of overseas, or are the cheaper rates still worth it at the end of the day?


I haven't read up too much about 174. Are you suggesting contractor pay can't be written off anymore as well, and thus is less attractive, even at a lower dollar/hour cost, than hiring domestically?


Colombia*


Right now I have team members from Ukraine, Croatia, Montenegro, and Romania in Eastern Europe as well as Chile, Argentina, Brazil, and Mexico in LatAm.

Some countries have more stable talent pools than others - Ukrainian engineers tend to be notably high quality while Mexico can be very hit or miss.


It may sound novel and egalitarian to you, but the next time there is a layoff, you're going to be the most expensive employee at your company...


We hired some great talent from Argentina. Great tech and English skills, and the similar time zone was key.


The downward pressure depends on where you're sitting. Here in Kentucky, being able to pull in a FAANG salary that's based on the cost of labor in Seattle is a huge upswing in salary.


> the rise of remote work and the downward pressure that it puts on wages

I've seen the reverse.

The same position pays easily 20% more than 4 years ago. (Though a lot of that is general inflation.)

I work in the US, but not a large city.


Yes, it might increase for those outside the city, but for those inside, it's overall downward pressure.


The last 4 years were 4, 8, 5, and 1% inflation, respectively.


And a lot of inflation is wages.

So when you talk about upward/downward pressure on wages, you can't discount inflation, because inflation is wages and the product of that upward/downward pressure.

You can talk about real wages.


It wasn't? The majority of that inflation, certainly in the EU, was energy and profit taking in uncompetitive markets, the result of decades of failed anti-trust?

This stuff is researched these days, we don't need to repeat 1930s tropes.


I mean everyone talks about 0% percent interest rates.

Effectively, why didn't businesses just borrow a boatload of cash - any sane person could posit that it wouldn't last forever, after all - and build up hoards of cash independent of revenue for the future, so as to deploy it when everything craters, thereby making things like hiring and acquisitions cheap?

That seems like it would have been a good use of money; what everyone (talking-head-media-wise) is effectively saying is that businesses were able to get money at "little to no cost". It seems likely, then, that borrowing enormous amounts of cash, investing it in relatively safe growth securities, and waiting for your competitors to eventually crater because they are living high on borrowed time would have been a viable strategy for some businesses.


Some definitely did. I distinctly recall my previous employer saying they were doing exactly that 5-6 years ago during an earnings call (keeping a large loan balance because interest rates were low, and they could use the money in the future).


> At Automattic last year we did not do layoffs, but allowed performance management and natural attrition (voluntary regrettable was 2.9%, non-regrettable 6.8% for us in 2023) to allow our size to shrink down more naturally

"Non-regrettable" attrition means either that the person was fired or laid off. Since the post says Automattic didn't do layoffs, that means all of that attrition took the form of being fired because of performance problems.

If 7% of the company exited the company in a year because they were fired -- which seems quite high to me -- then even if you don't call it a layoff, it has many of the same characteristics as a layoff.

In a classic layoff situation, you've over-hired, expecting future growth, and then the winds change, and you need to reduce the workforce to match the declining growth.

In a case where you're firing 7% of the company because of performance problems, you haven't over-hired, but you have certainly poorly hired!

Either way, now the remaining workforce has to adjust and "do more with less," and I'm going to guess that most of them have a reasonable fear that they might be next on the chopping block!


> "Non-regrettable" attrition means either that the person was fired or laid off.

Unless that's a technical/industry term, I assumed it additionally means people who quit, but the company did not mind seeing them go. That is, their performance was only fine-ish, maybe not down to the level where they'd get fired, but also not at a level where their departure would be lamented by their manager.


That's right.

It's common to distinguish the two.

E.g. a sales rep is underperforming and decided to leave. The decision at time of termination was technically the employee's (not fired, not laid off), but the company would not seek to retain or rehire.

In companies with good communication and clear incentive structures, this is a somewhat common occurrence, as employees realize there is not a good opportunity for them.


I don't think the methodology is completely standard, but you generally want an objective way to differentiate between the two forms of attrition. One way to do it is to define non-regrettable as having documented performance or behavioral issues at the time they decided to leave. Another way would be to do talent surveys to get an evaluation ahead of time.


Yeah, and any manager will do their best to mark anyone they can as non-regrettable because a manager isn't gonna last long if their good employees are the ones leaving.


I took it to be that their contract just didn’t get extended.

Not fired, not laid off, just not continued past some initially agreed upon end date.


You are correct. That's exactly how I've always seen it used in industry.

It also implies the company wouldn't let them back in as a boomerang hire.


Are 2.9% and 6.8% acceptable or even 'good' levels, even for the tech industry? I'm not quite sure. A 6.8% performance-related quit/termination rate seems pretty high from my limited anecdotal experience.

Edit: now that I'm thinking about it, I don't think I've worked at a tech company where people were fired/asked to resign at more than 2.5x the rate of those who resigned on their own.


2.9% would be excellent if sustained. It would correspond to keeping a significant majority of the employees you want to keep for more than 10 years. That's generally quite rare.

I'm less clear on how to assess the 6.8%. It seems somewhat significant, though if you're hiring many people, that's a period where you might expect churn, as some of them don't work out.

Of course, you can't extrapolate any of this, as 2023 was a year when employees would be very averse to moving, and it was also a year when many companies were coming off of previous hiring sprees. So expect the 2.9% to eventually increase.
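
The "more than 10 years" part checks out if you run the compounding (a quick sketch, assuming the 2.9% held steady):

    # Share of wanted employees still around after a decade of
    # 2.9% annual regrettable attrition (hypothetical steady rate).
    retention_10y = (1 - 0.029) ** 10
    print(f"{retention_10y:.1%}")  # ~74.5%, a significant majority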


> Is 2.9% and 6.8% acceptable or even 'good' levels, even for the tech industry?

I would put 2.9% at the very-good-to-low level. It suggests 100% turnover roughly every 34 years, which is fine, especially for the tech industry.

6.8% for performance strikes me as an indicator of very bad hiring and/or onboarding. A charitable view would be that many years of bad hiring got dumped in one year (so each year only had a small % of bad hires), but I wonder if that was the actual case.


"Non-regrettable" just means the company wasn't too sad to see them go, not that they were necessarily forced out. They could've been a poor performer that found another job on their own, and they wouldn't want to rehire them.


You can steer attrition by so many parameters - compensation, (non) promotion, change in benefit plans.

And again, a "non-regrettable" termination can also apply when the employee quit.


Another trick: In order to avoid being on notice as a manager because too many people under you quit, just declare it "non-regrettable".


Most large companies aim for ~10% total turnover a year.


> "Non-regrettable" attrition means either that the person was fired or laid off. Since the post says Automattic didn't do layoffs, that means all of that attrition took the form of being fired because of performance problems.

> If 7% of the company exited the company in a year because they were fired -- which seems quite high to me -- then even if you don't call it a layoff, it has many of the same characteristics as a layoff.

In my experience, this will have a way worse effect on morale than layoffs and will trigger regrettable attrition, as the smarter people in the room realize that you're always one reorg away from having a 'performance problem'. Sad.


Isn’t it the other way around? If you’re a decent performer you should be more afraid of random layoffs (like FAANG in the past 2 years) than performance based layoffs.


Not necessarily. When it comes to someone's livelihood, 'performance' is subjective. Have you ever been involved in employee performance evaluation for stuff like stack ranking? It's very easy to spin a failure as a success or the other way around. Say you're a decent performer and a reorg makes you end up under someone who does not share that belief, even if by objective metrics they are wrong. You become a low performer without any change in your objective performance.

FAANG-style random layoffs, if done like ripping off the band-aid, are not ideal, but in my view ritualistic performance-based firings are way more draining for the morale of any organization. I'm not saying that companies should keep low performers on board; I'm just saying that if you introduce the incentive of reducing headcount, management will find low performers everywhere and will drive the other employees either into burnout or to competitors.


You are always one manager away from not being a decent performer and good managers tend to be promoted or leave, sometimes faster than you do.


This. The more you reorg the less confidence I have that management knows shit.


Not necessarily. Depends on their definition and process.

“Voluntary regrettable” would imply the employee left voluntarily but regrettable because the company would have preferred they stayed.

“Non-regrettable” means they left for reasons that aren’t regrettable such as performance or natural attrition. Either they left on their own or were fired. Hard to say since they don’t specify.


Typically "regretted attrition" is employee managed. "Unregretted attrition" is employeer managed. Regardless of the cause.


Regretted attrition == The employee left and the employer regrets it (They would have wanted the employee to stay)

Unregretted attrition == The employee left and the employer is glad they did. Good riddance.


I’ve also seen people who were good performers but just difficult to work with that were categorized as non-regrettable.

“Hi boss, I’ve got another offer but wanted to give you a chance to counter” -> “Congrats on your new role, why don’t you go ahead and wrap up by Friday? I’ll skip over to HR and get the paperwork handled for you. Do you need a box for personal belongings?”


6.8% seems like a pretty reasonable target for non-regrettable attrition. GE famously aimed at firing the bottom 10% of performers every year. In Facebook engineering management the internal target was 6% IIRC.

It feels a bit paradoxical, because there are many 10-person teams with nobody who deserves to be fired. Yet if you have an engineering director heading a 150-person department and that department allegedly has no poor performers, that probably indicates a lax culture that accepts poor performance.


> In Facebook engineering management the internal target was 6% IIRC.

I haven't seen an NRA rate enforced at Facebook; however, 6% is the target at Amazon.


> It feels a bit paradoxical because there are many 10-person teams with no poor performers that should be fired.

These bottom-dwellers are often spread out amongst teams so that there is low-hanging fruit on each team.

The conflict happens when multiple managers try to hold on to next year’s sacrificial lamb as well.


Doesn't seem unreasonable, as long as it's a soft target applied at a large scale. When every manager needs to fire one employee every year or two or get fired themselves, it uber-sucks.

Any time a manager needs to fire a performing employee, the company dies a little. You start destroying cooperation, and selecting sociopaths as managers.


> now the remaining workforce has to adjust and "do more with less,"

If those cut were actually low performers, not necessarily. If the attrition was dead weight or worse, incompetent, their good performing peers might do much better since they’re no longer cleaning up someone else’s mess or being demoralized that someone is getting away with doing nothing but still getting paid. The manager of dead weight gains time to be more involved or strategic, peer teams no longer waste time communicating to a useless person, small mistakes stop multiplying, etc.

It’s situational, but I don’t think cutting low performers is a problem if there’s serious performance problems and it’s not just being used as an excuse or being applied blindly like stack ranking.


> "Non-regrettable" attrition means either that the person was fired or laid off.

No. Every leaver is classified as a regrettable/non-regrettable case - this was part of my last exit interview when I quit the job. "Non-regrettable" just means you have very low chances of being hired by the company again.


This should shed light on what the author meant by those terms: https://www.linkedin.com/pulse/how-determine-regrettable-ver...


In my experience, the USA is shitting on its entry-level engineers. I know 3 junior engineers unemployed 6 months after graduation, from top 20 to top 40 schools. They have 10 total internships among them.

One had a Zoom corp job with 24/7 hours; they had no life and quit. One had their offer rescinded 4 weeks before their start date. One had their position cancelled 60 days after their start date. All are now long-term unemployed with virtually no interviews.

Meanwhile, overseas hiring and 85k H1B workers are flooding the job market annually; it's galling ...


As the top comment says, this is sadly a consequence of remote working.

I started my career in the dot com boom a quarter century ago, and when it burst there was initially a huge rush of fear that "everything will be outsourced to India and China". That never really panned out, primarily because (a) teleconferencing tech wasn't nearly as good as it is now, and (b) the huge timezone differences were an absolute killer for productivity, especially in a world that discovered things like fast cycle times and continuous deployment were a critical competitive advantage.

Now, though, I think a lot of companies have learned from those problems. I see much more outsourcing to places like Latin America (same or nearly the same timezone) or Eastern Europe (obv. not as good timezone overlap, but still doable if you have US workers start a bit earlier than usual and European workers a bit later). Also, since nearly everyone, even in "RTO" offices, spends a huge amount of time on Zoom, etc., it's much easier to have outsourced workers treated as fully equal team members.

I agree, it totally sucks for US entry-level devs. Based on my experience in the US of how we outsourced other previously critical competencies, I don't have a good answer.


Class interests supersede national interests.


> Sam Altman when he says there may someday be a billion-dollar company run by one person who is able to highly leverage future AI agents to automate most traditional roles at a company.

Altman is nearly right, but Warren Bennis had it better:

The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.

It's not that companies will "leverage AI" but that AI will leverage humans. Companies are already "AIs", but that will become more and more apparent as time goes by.


We build cybernetic systems called companies with the goal of creating/increasing profit. As we get better and better at making these systems, humans play less of a role and the roles we do play become ever more formulaic.

I wonder when we will hit a turning point when over 50% of people in America would say “yes” to the question “is my boss a computer?”

For instance, if you work at McDonald's, you probably have a shift supervisor, but the beeping monitor showing customer orders and wait times is the real head honcho. Piss off your boss and you'll get a talking-to. Piss off the computer and you're out (quantifiably!)


Reminds me of "Manna" by Marshall Brain[0]. It's a short story about a fast-food restaurant where the manager is a computer that directs employees through an earpiece, with the power to hire and fire.

[0] https://marshallbrain.com/manna1


Interesting observation about McDonald's.

I work as a DoorDasher and have never once spoken to or interacted with anyone who works at the company. Well, that's not quite true: I had an order with issues once that went to a CS rep in the Philippines who I chatted with via messaging.

For all intents and purposes my "boss" is a computer.


> We build cybernetic systems called companies with the goal of creating/increasing profit.

Except CEOs such as Google's are incentivized to maximize their own payouts. It just so happens that during their tenure, this is aligned with shareholder value.


Yeah, neither of those things is going to happen in our lifetimes. Regardless of what Altman says.


"in our lifetimes" is a very wide range here. There are probably both teenagers and octogenarians on HN. Did you mean your statement to apply also to the teenagers?


When it comes to fully automated factories owned by AI-powered one-person unicorns, that would include said teenagers' children, if not grandchildren.

AI-powered, billion-dollar-valuation, over-hyped start-ups, well, those risk happening this decade.


Lol, I agree with you. I think ambitious ideas about automated factories come from people who have never worked in a manufacturing industry, much less an automated one.


Play the game "Flateye" [0,1] and come back to us on that :)

[0] https://flateye-game.com/

[1] https://www.youtube.com/watch?v=5HdYFdcfdXk


Try Flat Eye at managing a factory, and come back to tell me how it went.

A game as a reference, seriously? No idea in which film I recently heard the term, but "amateur hour" doesn't even come close to describing that...


Did you just actually take me giving you a joke game as a "reference" seriously? I'm not sure what worries me the most now.


Unfortunately, I cannot tell anymore sometimes...

As a joke, it is right down my alley! So please accept my apology.


We're cool hef.

At least I assume it's a joke. Maybe the joke's on me, but the review seemed to paint it that way.


This is a total fantasy. The amount of hype around these "guess the next word" games is so far beyond ridiculous at this stage I just don't know what to say.


It is a ridiculous fantasy, yes indeed. But the danger with AI is not what it's actually capable of, it's what people think (hope and fear) it's capable of, and how that cultural effect impacts real life.


Absolutely right. As I also put it in a past comment: forget cutting-edge ML/DL/AI; even basic tools from the last couple of decades, like timesheet/hours management and ticketing/issue management, can totally control >90% of the tech workforce, and possibly even more of the non-technical workforce. The future I see is AI deciding what minimum level of sustenance, if any, to be provided to humans based on their usefulness to prevailing economic systems.

Most people already run in a tight loop routine all their life. The great learning for them with futuristic AI could just be realizing this plain old fact.


> The future I see is AI deciding what minimum level of sustenance, if any, to be provided to humans based on their usefulness to prevailing economic systems.

For this you believe you need to look to the future?


> Some of this productivity gains just come from adoption of existing tools like Google Workspace or Office 365, issue trackers and version control with tools like Gitlab, Github, or Jira

I guess I haven't seen Jira be something that actually improves productivity. More often it seems to be a productivity suck.


I’ve usually found that to be a symptom rather than the root cause: if Jira is turning into a job rather than a tool to do your job, switching tools without switching managers won’t change much.


The value comes from writing down what to work on in advance, and figuring out if it's really that valuable. A text file is just as good as Jira, but harder for multiple people to edit.

The "better" issue trackers (Linear is the good one), just have a better UX for the same basic task.

But you have to look at non-tech businesses and compare to ours. Like, my property management company doesn't have a ticket tracker, so if you need something, you have to call them and remind them, causing them to context switch, and perhaps prioritize your low-priority item today when they could have done it tomorrow... if they remembered it was an issue. Jira is about remembering what you need to work on, and which thing you should work on first.


> The value comes from writing down what to work on in advance, and figuring out if it's really that valuable.

The most valuable tasks that I work on come mid-sprint, are emergent, and I have to create the ticket myself. Without fail.


Office applications (spreadsheets excluded) also look like a tremendous productivity sink to me.

Anyway, the answer the article is looking for seems very clear to me. The informatics companies got into a hiring scare recently, and are trying to undo it with a firing scare. Those companies appear to be run by bozos, and trying to look into theoretical fundamentals for their behavior is a fool's errand.


Consider the old-school alternatives, though.

Before we had computers, we had electronic word processors. They were a step up from typewriters, but computers made creating and editing documents so much easier and streamlined. No need to retype from scratch if there are errors that require too much correction fluid/tape, or if you need to add or remove or rearrange paragraphs. And then pushing that further, putting the documents online in a way that allows for easier collaboration, means no more sending .doc files through email and keeping track of who has made what changes, and merging everything into a final document, manually.

As much as we all despise the stereotypical PowerPoint presentation, these sorts of apps let you create presentation material without expending too much effort. Otherwise you're creating poster boards by hand, or typing something up and photocopying it, or creating some kind of overhead projection thing. Collaboration is again so much easier.


If you go looking for how much productivity those older paper documents enabled, the answer was already all over the place. Computerized ones have the unfortunate feature that they are so cheap to manage that people were pushed into creating all the questionable-value ones, and then some clearly negative-value, and then more.

Instead, structured information is what was already very valuable, became more valuable in a computer, and got even easier to create. But the one defining feature of structured information is that Office applications can't deal with it at all.


In theory, it can be a productivity enhancer, if you're comparing it to absolutely nothing. However, if you compare it to, honestly, post-its on a whiteboard that are halfway organized, or a plain txt file that all the people can get to (if it's a remote company), then frankly it's almost always a productivity suck.


Jira sucks, sure, but going from no-bug-tracker to any-bug-tracker (and by “bug” I mean tasks of any kind) is a huge boost.


I don't think productivity gains explain why companies need fewer people to be employed. A better explanation is that tech companies have grown too big. They've established big enough moats to prevent competition from arising, and if for some reason competition did arise, they can just be bought out.

So we have something like a monopoly/oligopoly situation where companies are in a rent seeking phase and don't need as many employees as before when they were in a value producing phase.


> I don't think productivity gains explain why companies need fewer people to be employed

Right. If these productivity gains actually scaled, then why not add more people?


I’m a security engineer, so my work is a bit different than SWE - data wrangling and analysis, tying systems together and correlating events across them, building defense in depth, cool and effective under stress.

That said, LLMs are showing up a lot in my job. Being a good sec eng is having a 70% base in most systems, solid comp sci and programming chops (I call it "I can build and run a mediocre app"), and solid security expertise.

GPT is really good at radically speeding up how I get past the 70% starting point I usually operate from. I run (sanitized) terminal output through it, so the 60% of the output I'd otherwise table to RTFM later, I can understand immediately via GPT. Sec eng benefits a lot from leaning into pandas/notebooks vs. log CSVs, and GPT does that really well too.
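
To make "leaning into pandas" concrete, here's a toy sketch of the kind of triage one-liner GPT hands back; the file name and schema below are invented for illustration:

  import pandas as pd

  # Invented auth-log export; column names are placeholders, not a real schema.
  logs = pd.read_csv("auth_events.csv", parse_dates=["timestamp"])

  # Failed logins per user per hour, noisiest accounts first -- the sort of
  # groupby I used to table for RTFM later and now get back instantly.
  failed = logs[logs["result"] == "FAILURE"]
  summary = (
      failed.groupby(["user", pd.Grouper(key="timestamp", freq="1h")])
            .size()
            .sort_values(ascending=False)
  )
  print(summary.head(20))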

The big marker for me is incident response - standardized tech data requiring analysis and correlating in a pinch. I’m going to have an incident response LLM partner soon enough. Analyzing open source codebases for how they do input sanitizing with a new to me language? LLM partner walking me through.

All this together - goodbye entry level cybersecurity jobs in a few years I think. Many of the things you’d need a security analyst for or the more busy work sec eng 1 jobs I truly think are turning into LLM jobs. My lived experience this past year reflects it.

I think productivity gains from LLMs are under the hood of tech layoffs right now. Curious if other engineering tracks are seeing this though? A SWE buddy at Google thinks so.


I'm 15+ years in security and just this week I needed to hit a few domains, find a script tag that imports JS from a certain CDN, parse and make sense of it.

After 20 minutes of telling ChatGPT exactly what I needed and a couple of test runs and optimizations, I had the perfect tool. 10 years ago this would have been a half to full day project.
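
Roughly the shape of the tool, for the curious; the domains and CDN host here are placeholders rather than the real ones:

  import requests
  from bs4 import BeautifulSoup
  from urllib.parse import urljoin

  # Placeholder inputs -- the actual domains and CDN aren't named here.
  DOMAINS = ["https://example-one.test", "https://example-two.test"]
  CDN_HOST = "cdn.example.net"

  for domain in DOMAINS:
      resp = requests.get(domain, timeout=10)
      soup = BeautifulSoup(resp.text, "html.parser")
      # Flag every <script src=...> that pulls JS from the CDN of interest.
      for tag in soup.find_all("script", src=True):
          src = urljoin(domain, tag["src"])
          if CDN_HOST in src:
              js = requests.get(src, timeout=10).text
              print(f"{domain} -> {src} ({len(js)} bytes)")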

I had a meeting with an entry level security engineer and he asked if I need him to do the task, as we were both in the meeting where it was being discussed.

I didn't even think to ask him! It was quicker and easier to do it myself.

This was the type of project I had been assigned dozens of times during my first job.

I don't know what that means for the future of work but it's changing fast.


Scripts and Unix-pipe-style workflows like this are going to be the easiest to automate. Even 5-6 years ago, if you knew where to look for cutting-edge stuff, this was what they were experimenting with in the automated code-gen space, and it was pretty good back then. The difference is you didn't use natural language prompts; you had to be able to ask for things in a very specific way.

It doesn't make up the bulk of development (though LLMs are eating the low-hanging fruit there too; it's just going to take a little longer), but it's a canary in the coal mine for how much this tech will disrupt people's jobs.

Whether it ultimately means it'll be like the adoption of the computer itself (an explosion of jobs relative to those lost, due to the wide scale of computers) or not remains to be seen. It might open up higher forms of work and a focus on thornier problems.

It also might kill huge swaths of the sector.


This. We were supposed to hire a junior dev to help me build out a new app. I'm moving so fast using copilot we just forgot about it.

I'm putting out a complex product at break-neck speed using frameworks and tech I was barely familiar with when I started.

This is not a good time to be a grad or junior dev, or a search engine.


Why did you think you needed to hire a junior dev before even starting work on the application? I know estimation can be a difficult task but the typical "I'm moving so fast..." type experiences usually mean you didn't or don't understand your tooling or the scope.

Also how were you going to take on a junior dev and a new framework at the same time? Were you expecting them to know the framework?

As the saying goes, though, the last 20% takes 80% of the time.


Because the project was big enough to warrant more than one person. I have a whole team surrounding me to handle non-technical/non-development incidentals. Most companies would have had a lot more budgeted and would have pre-hired five devs. Then everything would have moved glacially slow, fulfilling the prophecy that five devs were needed.


  > Because the project was big enough to warrant more than one person.

But based on what, the scope? If you weren't familiar with the tech stack how would you gauge that? I understand people can conceptualize frameworks at a high-level.

  > I have a whole team surrounding me to handle non-technical/non-development incidentals.

Are these the people finding the junior or the (5) devs that would be needed? Do they have experience with the framework to know how to scope the project? The hiring of 1-5 developers in-house or even as contractors is a labor-intensive process, so I'm not really sure companies would have just done it based on an idea for an application. I can see where they might have hired early based on winning a contract, but in that case they probably underestimated the work or padded the cost to account for ramp-up time.

  > Most companies would have had a lot more budgeted and would have pre-hired five devs.

Maybe you haven't worked places that do spikes or just allow people to develop prototypes without entire scoping documents or hiring people. Also keep an eye on your worth here. If you are saving the company the cost involved in getting (5) more developers, then you should be getting a bonus or have decent compensation. A lot of people fall into this trap of "saving" the company money as if it's their own; it's not, and unless you are getting some of that savings you are diluting your current pay and working twice as hard.

  > Then everything would have moved glacially slow, fulfilling the prophecy that five devs were needed.

Yeah, this is understood as the "mythical man-month" in terms of things slowing down. Adding the wrong head count is a planning and leadership issue. There is nothing stopping teams from being dynamic at a point, but that depends on how long the application is going to be supported. Having (5) people now can spread out the working knowledge and workload enough that no single developer is holding up forward progress. If you are having to mentor people on the project or fix mistakes, then they are the wrong people or the wrong skillset for the team. A leader will be able to convey the issue to management and have people let go or replaced. People don't like to do this, but there is no reason to keep a failed process going, as we are all professionals. Alternatively, people above you have accepted this as part of the application development process, it justifies their job, and they are fine with it, so getting the work done any faster is just a bonus to them.


Honestly, it sounds like it wasn't a tool that is needed often; if it was, you or someone else would have already written it. Or you don't program enough in JavaScript/Python day-to-day to do this quickly. There isn't anything wrong with that; as you mentioned, you have entry-level security engineers that typically handle those tasks. Creating a tool goes fast when you know exactly what you want it to do and don't have to explain to another person all the requirements and pitfalls to avoid, based on experience you might have in writing quick scripts. I don't know if this really changes anything.


Fascinating to hear similar from you.


Entry level devs will need to be much more skilled than I was to enter the field a few years ago.

My internships and the first 6 months of my first full time job are trivial to ChatGPT. Copilot would be needed for work since then (as it is specific to the codebase), but even so, I am far more productive with them.

One of my first internships was hacking together a mobile demo of a digital ID concept. I’d be surprised if it took more than a few hours to replicate a month of vanilla HTML/CSS/JS effort from back then.

I would prefer ChatGPT to me as a co-worker up until about 1.5 years of experience, if simply because it replies instantly and doesn't forget stuff.


Right - I think when the equivalent of CoPilot shows up in incident response, the security employment market changes for good. When a "cleared" CoPilot (for govt-supporting work) shows up, it changes totally.

If you don't operate in the approach I describe, or are not just an all-around tech expert who likes security for some reason, the stable high-paying market is around digital forensics/incident response firms. Those folks have a lock bc there's a small group who knows assembly and OSs across multiple systems very well and knows it from a security context. Tribal work for an LLM soon enough, as it's just parsing opcodes and stretching across log sources, end of the day. Scary stuff; glad I'm past entry level, and I'm no fool thinking that I don't have to worry too.


I'm not sure I see this as a reality anytime soon.

  > Those folks have a lock bc there's a small group who knows assembly and OSs across multiple systems very well and knows it from a security context.

There are two parts to this. The first is that for some of the businesses in that arena, I'm sure if they could speed up analysis to take on more client jobs requiring less labor, they would have done so. The second is: what output are you going to provide that wouldn't need the very same people to decipher, validate, or explain "what" is going on?

As an example if you get hacked and you make a cyber insurance claim you are going to have to sufficiently explain to the insurance company what happened so they can try to get out of paying you and you won't be able to say "Xyz program says it found malware, just trust what it says." If people don't understand how the result was generated they could be implementing fixes that don't solve the problem because they are depending on the LLM/decision tree to tell them what the problem is. All these models can be gamed just like humans.

I'm not quite sure I agree that a better LLM is what has been keeping people from implementing pipeline logic to produce actionable correlation security alerts. Maybe it does improve but my assumption is much like we still have software developers any automation will just create a new field of support or inquiry that will need people to parse.


I think the impact of LLMs in DFIR will come down to

- speed at which actionable insights can get generated (otherwise needing a very high paid eng poking through Ghidra and cross-log correlation)

- reduced need for very high paid DFIR engs due to the above.
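
For anyone outside DFIR, a toy pandas version of what "cross-log correlation" means mechanically; the file names and schemas here are invented:

  import pandas as pd

  # Invented exports from two log sources; merge_asof needs both time-sorted.
  edr = pd.read_csv("edr_alerts.csv", parse_dates=["ts"]).sort_values("ts")
  vpn = pd.read_csv("vpn_sessions.csv", parse_dates=["ts"]).sort_values("ts")

  # Attach to each endpoint alert the most recent VPN session for the same
  # user within 15 minutes -- the join an IR team would otherwise hand-write
  # as ad hoc search queries under stress.
  joined = pd.merge_asof(
      edr, vpn, on="ts", by="user",
      direction="backward", tolerance=pd.Timedelta("15min"),
  )
  print(joined[["ts", "user", "alert_name", "src_ip"]].head())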


New devs' code looks like ChatGPT wrote it if it had been trained on pre-existing parts of the codebase. Copy pasta eeeverywhere :p


For this specifically, it's no different than when Stack Overflow was the go-to solution.


Worse, maybe. I used to be able to tell when someone was using SO because everyone was blindly copying the same email regex answer. You can run Mixtral on a personal computer and get novel output now. It's so much harder to detect from just looking at a PR.


80% of being a good security engineer is knowing the big picture: all the parts and how they work. The correlation that an LLM produces has no value if it's not actionable. You are the one that determines the weights, values, and features that are important. I'd be very curious how you currently account for scheduled outages, unscheduled outages, new deployments, upgrades of existing systems, spinning instances up and down for testing, laptop device swap-outs, and traffic in different silos. How are you baselining normal communications and session timing between services or across protocols? If you are in the cloud, is baselining done per service: HTTP, DNS, DB, etc.? I could see different weights being constructed to represent defense-in-depth, but this would seem to be a constant amount of work while also investigating or feeding true/false positives back into the system.

Entry-level cybersecurity isn't a thing, which is why it isn't working out: typically you need prior dev, ops, devops, SRE, sysadmin, etc. experience. The talent shortage is because you can't do an undergrad in cybersecurity and somehow pick up the prior operational knowledge that develops your skills for understanding and troubleshooting how systems, networks, and applications all function together. Cybersecurity as it stands, and as you mention, is in my experience best as a focus off of computer science. I mean, even the CISSP requires working experience in the field.

The one item I think you are overlooking is that you have the experience in how everything works together, which makes a tool like ChatGPT or some other analyzer where you can ask the "right" questions useful, because you have the mental mapping and models, built through experience, of the questions to ask. So while a security analyst job might go away, you are back at the original problem of developing security engineers that know the architecture, flows, daily expectations, etc., and having an LLM buddy is not going to turn a security analyst directly into a cybersecurity engineer overnight.


> The correlation that an LLM produces has no value if it's not actionable.

For security, there are two parts to this:

- correlation within detection engines, i.e. what CrowdStrike does: CS and so on are already doing what you describe (baselining normal system and identity behaviors). It is still hit-or-miss, but noticeably better than a few years ago, and I think this current AI era will push it further. These have already taken away the need for several sec eng hires.

- correlation across logs, i.e. an incident is happening, under time pressure and stress, and usually this is an IR team putting together ad hoc search queries and so on. LLMs, as many of them seem to have indexed query-language docs and much of the open source documentation on AWS, O365, etc., are an almost invaluable tool here. It's hard to explain how quickly security dev across pre-incident prep or in-incident IR is sped up by them.

> where you can ask the "right" questions a useful tool because...

Yes, this specifically is one of the great value-adds currently - gaining context much quicker than the usual pace. For security incidents, and for self-build use cases that security engineers often run into, this aspect alone is enough to be a huge value-add.

And I agree, it will exacerbate the existing version of this, which is my point on replacing analysts:

> you are back at the original problem of developing security engineers...

This is already a problem, and LLMs help fix the immediate generation's issues with it. It's hard to find good sec engs to fit the developmental sec eng roles, so those roles become LLMs. The outcome of this is... idk? But it is certainly happening.


> All this together - goodbye entry level cybersecurity jobs in a few years I think

If nobody is entry level how would anyone be able to penetrate the job market? Nobody graduates into a mid-senior level job, right?


That is the question.

I think SWE is going to experience what cybersec already has an issue with.

The “talent shortage” in cyber is for mid levels, not entry level security analysts. This is bc cyber is applied IT and engineering through a security lens. It’s hard to make the jump into a sec eng career bc of this. IMO, LLMs build the wall higher.

To your point, nobody is really graduating right into cyber. Most go to non-technical compliance jobs or MSSPs if they are cyber-first, or get really lucky on a self-developed path. The rest are lateral transfers from IT or SWE/ops. MSSPs are hard to get out of (24hr security triage centers), and are prime targets for LLMs.

I speculate SWE starts experiencing what cyber has dealt with for years - what do you do if entry level is automated, or the knowledge bar is really high. I think cyber just gets worse/harder to get into.


You will start to see this turn around as companies realize they need to go back to the path of entry -> mid -> senior/principal. For cybersecurity this is operations and/or development -> cybersecurity w/ focus on either dev or operations. Then at the senior/principal layer people can float between things. This isn't too far off from many other jobs: no EE out of school is designing circuits and boards from scratch; it's debugging what senior EEs have created, or problems in existing products, and then you work your way up. It's the same with cybersecurity: does a person who hasn't developed software start out doing reverse engineering? Does a person develop or approve security policies, devices, network architectures, or designs if they haven't ever deployed an application or service in production? How are you determining if something is an incident or a valid alert if you haven't managed a network?

When money was free, companies could hire people for very specific tasks and knowledge areas because it wasn't costing them anything to get the money. This is why layoffs in engineering, while smaller in percentage terms than in other departments of the company, are for specialized jobs where in other times it might have made sense to get a consultant or contractor.


> For cybersecurity this is operations and/or development -> cybersecurity w/ focus on either dev or operations.

Fwiw, I haven't heard of or worked at any company implementing this pipeline formally. And cyber teams (or more appropriately, the industry career thought leaders) expecting it to work this way is a large part of the existing issue.

Fundamentally, under this logic one industry (cyber) is relying on another (SWE/IT) to train its entry level candidates. Logical enough.

In practice, some of the issues:

- there are very few entry-level roles in cyber that aren't a large pay decrease for the SWE to take for a year or two. So, many don't make this jump unless there is a clean pivot into appsec or infrasec. The set of companies needing both of those is small; you largely only see this pivot in tech.

- IT teams don't particularly want to lose their headcount, so outside of an excellent manager or a very self-steering IT eng, nothing in IT is helping the aspiring sec eng make the jump over.

The end result, to solve this problem

> Does a person develop or approve security policies, devices...How are you determining if something is an incident...

is it's not really solved in a clean way. There's a massive talent gap and favorable mid+ sec eng employment market because of it. Cybersec is already experiencing it, LLMs will make it worse, and I think it'll get worse for devs as well ("How are you determining a performant app if you've never built an unperformant one and fixed it?")

// which is a long way of discussing

> You will start to see this turn around as companies realize...

it hasn't turned around in cyber fwiw and it's been growing for probably 2 decades, 1 decade in earnest. Perhaps b/c SWEs are a profit center vs. the security cost center, there'll be motivations though. IMO the only thing driving sec eng hiring isn't companies realizing career pipelines are messed up, it's regulations or getting hacked in profit-damaging ways, and there aren't a ton of companies in those buckets


  > it hasn't turned around in cyber fwiw and it's been growing for probably 2 decades, 1 decade in earnest. Perhaps b/c SWEs are a profit center vs. the security 
  > cost center, there'll be motivations though. IMO the only thing driving sec eng hiring isn't companies realizing career pipelines are messed up, it's regulations 
  > or getting hacked in profit-damaging ways, and there aren't a ton of companies in those buckets

I don't know; from my observations, "cybersecurity" has only been a thing in the last decade outside the defense industry. Before that it was information security, and most operations/network security was done by systems and network administrators[1], with the driver being reliability of services versus any concern about the equipment or data on it.

While hacks are a driver of the cybersecurity field, the biggest driver, as with all things, is insurance companies and cyber coverage. Insurance companies requiring people to be dedicated to keeping up with vulnerabilities, secure default implementations, and data restrictions is what is driving the need, and companies just want to fill it to keep their coverage or keep their rates lower. It's the typical idea that if you add more software developers or people to a project it gets done faster, when in reality it doesn't work that way. This is why I think we will see a shift back to a more graduated source of cybersecurity professionals. There wasn't a formal path to being a systems administrator or network administrator compared to the Computer Science degree -> developer path.

Thanks for the astute discussion. It's much better than the one-line bot responses that you typically see now.

[1] For all the young kids: these jobs were renamed DevOps, NetOps, SRE, etc. Previously these responsibilities were just part of operating a network.


> Before that it was information security

Fair call-out. To clarify, I swap what I call the job depending on the audience, but IMO the underlying requirements of the job haven't really changed. For a SWE/business audience - call it cybersec. At the security cons in Vegas - call it infosec. Obviously there are skill variations within the security needs of the day (i.e. pure "netsec" isn't around as much anymore vs. "cloudsec"). But skill shortages have persisted across all these variations of the job, IMO.

> insurance companies and cyber coverage.

I've primarily worked in tech or finance, and tbh I don't run into insurance topics a lot, although it's of course speculated as a possible growing motivator for the field and related hiring. The issue and "signal" I look for with that changing is when the Fortune 500-style mass data breach will actually turn into (a) uninsurability or (b) massive fines. Neither has happened yet, but IMO this is changing.

In terms of security programs I've joined where there was an incentive to hire, it is always something like this, which is what I mean by regulations or hacks driving hiring in my (anecdotal) experiences:

- Want to IPO, Series C tech startup? Must pass SOC-2, must hire security team.

- Horrible hack or very narrow close call, largely stayed internal -> board/founders gets fired up about cyber risk, and it filters down to hiring out a security team.

...


There are a lot of posts claiming that advances in technology over the past decade or two have been leading to "productivity gains". But the economic data simply doesn't support these claims. Productivity growth has stalled in most of the developed world.

I like the way tech companies operate. I like Slack. We don't use P2 but it looks interesting. I just don't really believe it'll turbocharge productivity at my company, much less at a more traditional company.

I'm going to avoid saying anything one way or the other about AI because that would be a separate argument.


Did anyone perform an analysis of where the employees laid off over the last 3 years went? According to layoffs.fyi, nearly 500k have been laid off since 2021; it would be interesting to see if people mostly reshuffled within FAANG or if there was a more structural talent migration, e.g. from megacorps to startups.


And how many haven't landed anywhere yet.


And after ~12 months or so, they are removed from the "unemployment" numbers as "no longer looking for work" so that the numbers look better.

ref: https://www.bls.gov/cps/cps_htgm.htm#nilf


No, you still count as unemployed even if you haven't had a job for 12 months, as your own link attests. The question is whether you've looked for a job at any point in the past year, not whether you've held one.

More broadly, the US government publishes six different measures of unemployment, all of which can be seen here: https://www.bls.gov/news.release/empsit.t15.htm . The media usually focuses on U-3 (currently 3.7%), whereas the broadest metric is U-6 (currently 7.2%).


Your comment is needlessly pedantic, as I linked the full answer and said "~12 months", which is "about twelve months", which is not incorrect, though I'm sure the average is probably closer to 18-20 months for being removed from the U-3 stat.


Or didn't seriously try to. I know people who almost certainly did pretty well over the past 15 years or so, got caught up in some layoff, and retired or semi-retired--probably a few years earlier than they would have done in different circumstances.


That's kind of what I've done. I looked around some last year, didn't see much out there, had a few interviews here and there that usually ended in ghosting. After the unemployment ran out stopped looking for a while. Then started looking again recently because I'm not sure I'm ready to be retired yet.


There are volunteer or more-or-less volunteer things you can do but you need to find the right open source project (or start one), find a channel where people will actually read what you write and do it regularly enough, have connections to people who actually want your advice, etc. Once you're not connected to an organization some of those things are harder than they seem unless you have specific, still-relevant credentials.


Or who ended up underemployed. I know one guy who ended up stocking shelves at Home Depot to keep food on the table for the family until the job market turns around. Technically employed, but not where he ought to be.


As a software engineer, finding a job is harder than it's ever been for me, but it still seems _easier_ than my peers in other industries.


Depends on the other industries, I guess. It's pretty easy to find service jobs right now. The other thing I'm noticing is that the hourly rates being offered for contracts are a lot less than they were 2 years ago.


Paying less for an engineer in a low cost of living area may save you money in the short term but will wound you in the long term if you value employee retention. If you have a truly skilled engineer paid cheaply, it’s trivial for a large company from a high cost of living area to offer them a 20% salary bump and snatch them away. Good luck if that employee had valuable insight into company processes.


> At tech companies some roles are highly leveraged… These leveraged roles can create enormous amounts of value, but the unlock in technology can come from a single person, a single insight.

This core idea about tech employees having high leverage is spot on. It not only results in the high compensation that tech enjoys at the moment, but also explains why attrition and layoffs are a part of the process. Companies with a lot to gain from technology are willing to throw money into hiring, but are simultaneously figuring out which roles are actually paying off and which ones aren’t.


Another obvious cause I don't see anyone mentioning: tech simply created a lot less value in the last decade than in any previous decade. Pretty much every one of the big hyped-up technologies that was going to be the future of everything turned out to be a bust: IoT, drones, cryptocurrencies, NFTs, and on and on. Even the ones that got some traction, like ride sharing, were really just castles of smoke floated on giant clouds of dumb Middle Eastern and Asian venture capital money looking for a greater fool. Most of the changes in web development tech have made software slower and more bureaucratic to make, in order to support large FAANG-style software orgs.

There was absolutely nothing remotely on the scale of the smartphone, social media, the web, the personal computer, etc from previous decades in terms of actually creating real value for users.

Today’s tech industry deserves to shrink.


A few of those thoughts I completely agree on: results of overhiring, and technology enabling bigger impact by fewer people seem spot on to me.

But a few I find strange. For example, a semi-implicit assumption that hiring, profits and stock price are tightly coupled. That is not always the case. They are correlated, but often have significant time lags, so it is possible for a company to hire rapidly on a great novel idea, realize a first mover advantage, have it lead to profits, have that lead to a stock rise by which time there is no longer a need for aggressive hiring. The company does not even need to do layoffs, just a stop to aggressive hiring is enough to feel a frozen job market.

Edited: apparently I totally misunderstood the meaning of "non-regrettable attrition". Is the only difference from a layoff that employees are fired one at a time vs. en masse?


> Is the only difference from a layoff that employees are fired one at a time vs. en masse?

Theoretically, layoffs are not performance-related, although sometimes they are. Getting laid off doesn’t necessarily reflect poorly on the person who was laid off.

Firing is performance-related, and it’s usually a bad look.

The former may get severance and will probably be eligible for unemployment, while the latter will likely get neither.


Are you sure about the unemployment? Given that it is usually administered by the state (at least in the US), I am surprised that any non-criminal involuntary termination would be rejected for unemployment benefits. But IANAL and personally never needed it, so anything is possible.


> Are you sure about the unemployment?

Definitely not sure, and it probably varies based on location.

In CA, where I am, one criterion is “unemployed through no fault of [their] own”. Getting fired is widely considered to be the individual’s fault (e.g., non-performance). I’ve heard something about the standard being “egregious misconduct”, but that’s just beyond the details that I know about.

Getting laid off usually isn’t considered the employee’s fault (e.g., due to lack of work or change in direction).

Iirc, companies might have increased payments into the state unemployment coffers when they do layoffs versus firings. I have only heard about this second-hand, so I don’t know the details.

Anecdotally and perhaps only tangentially related, I have friends and family who have had to or chosen to fire people, and they have all said the same thing — layoffs are much easier than firings.


> Getting fired is widely considered to be the individual’s fault (e.g., non-performance). I’ve heard something about the standard being “egregious misconduct”, but that’s just beyond the details that I know about.

I have no real information, this is just my handwaving, but I do not think this is the case. Most employment in the US is at will for both sides. That is, an employer can terminate an employee whenever it is beneficial for the business (and an employee can leave with no warning even if he is needed by the business). No "fault" is required on the part of the employee -- needs and priorities of the business could have shifted, for example.

I think most employers would take this route of "no fault termination" if at all possible. Otherwise it would make sense for the employee to fight it, file complaints, etc., incurring potentially large costs for the company on the time, effort, and PR fronts. So they tend to terminate employment with bland corporate legalese instead of a red flag like "misconduct". Being terminated by a previous employer may cause some concern when job searching, but unless it actually comes with a paper trail of "gross misconduct" or some such, it should not affect unemployment benefits. My 2c.


Maybe it's time for unionization in tech to stabilize things and protect tech workers? There is already unionization in some parts of Europe.


Unions only work in locations where capital is immobile and the labor to operate the capital is immobile. Tech labor is the most mobile on earth ...


I'm dumb, why do unions only work where capital/labor is immobile?


Because immobile capital can't move away from the labor. When capital is mobile, it moves away from the union, to a place without unions.


As a piece of general feedback regarding the writing, I believe the use of the term 'leveraged' (5 times) is overleveraged.


I genuinely don't know what "leveraged" is meant to mean here:

> Tech-first companies are going to become leaner and more leveraged.

To me, a company being "leveraged" means they're in debt. I'm pretty sure that's not what he means?


I spend a lot of time talking to PMs and I believe people throw this word around like some sort of MBA gang sign. Whenever I hear it, I know I will be interacting with a bullshit artist.

But to your point, I believe the definition for this context is:

> lev·er·age: use (something) to maximum advantage.


People talk about efficiency a lot, but they miss the basic point that efficiency in a free market means lower prices and decreased profits (due to competition). Companies' profits are soaring. Folks upset about layoffs are not against that type of efficiency; they are against bad-faith arguments that it's cool for leaders to make money on short-term gains (firing staff) while paying no costs for long-term losses.


I was laid off along with a quarter of staff last year.

If I’m being honest, all of these are true:

- I had life circumstances making me depressed and I get why I was included.

- Multiple projects I spent over a month on were just thrown away casually when management changed their mind. So I was being paid without creating value.

- I don’t think tech workers are overpaid per se, but most people are underpaid given the effort they put in or conditions they work in.

Even as I made great money as a tech worker, I was acutely aware that I wasn’t providing value to society. Or even arguably the company, even when I tried.

My favorite Orwellian phrase is "Human Resources". In a publicly traded company, headcount is another number to advertise in the game of stock valuation. People themselves become speculative assets.

I still benefited from this system and most people in the US have it worse than I do. That said, I find the expansion-layoff cycle icky.


This is the saddest post I have read on HN so far. All three bullets you listed are not your fault.

* Being depressed isn't a reason to get fired; everyone has human emotions and issues, and we're supposed to be supporting people in their time of need, not replacing them like some manufacturing equipment that broke down!

* Being put on tasks that management threw away later isn't your fault, it's theirs for mismanagement and not having a good strategy or tactics to achieve the strat.

[An aside: I was recently laid off from a position and the company still has me on a contract to maintain the code I wrote, which is 80% of what's keeping them floating now; so in that case I did a lot of useful stuff for them but they still laid me off!]

* Being overpaid is relative, and is also a management fuck-up; if they were really spending too much, and didn't just want to make a bit more profit, then they would adjust pay scales downward and offer new employment contracts based on those scales.

> I wasn’t providing value to society

FUCK THIS SENTIMENT! Everyone is important and provides value, even the losers sitting at home doing nothing with their lives have people they care about and who care about them. This fucking society! I fucking hate how people's sense of worth is tied to some stupid fucking corp that sells ads or makes profit from trading digital bits or pork bellies using digital bits. Have respect for yourself: you did the best you could do and some moron execs and managers fucked up leading to your layoff.

I would argue that providing value isn't even necessary, society will do fine if a few people are having issues and cannot provide monetary value to society! There are billions of humans, a few can take a break!


Thanks. I’m in a great place now. Appreciate the kind words.


I am glad to hear that; I felt your posting as if it were my own from a while ago.


> headcount is another number to advertise in the game of stock valuation

This is something I've had difficulty squaring over the course of my career.

If a company remains small but makes boatloads of money because it builds a superior product, shouldn't that be rewarded more than having a higher headcount?

From the perspective of having run engineering orgs (and a few failed - but enlightening - forays into founding a company), I've learned to believe from my experience that having the smallest team reasonable to achieve great things brings the most positive effects: folks are closer to the outcome, feel a stronger sense of ownership, want the best for the business, and reap the rewards of accomplishment, which in turn motivates the team to achieve and innovate more, resulting in growing revenue for the company.

If I were an investor, I'd want smaller teams producing outsized results. Why would I want a gigantic team producing mediocre results?


Because "growth". A 100 person high performing org looks okay right now. But there's always the "what if" factor. What if you scaled your org to 200 and eek out even more gains for shareholders or start taking on new projects for more growth opportunities. Hedonic treadmill in a way. The baseline for good is relative.


I hear you, and appreciate the response. The bias of my experience tells me that the "what if" factor can be created by a single person experimenting. When the thing that one person created takes off, awesome - hire some more folks to support and grow it. New revenue stream. More ownership by the person who created it. Give folks the time to experiment.

Hiring more people than necessary is, in my mind, akin to gambling. I, personally, hate gambling, and that's probably where my bias comes in on this topic.

Tying it back to my original response: when I say reasonable team size, my implication is reasonable growth based on real results, not just rolling the dice.

To turn the perspective from an engineering leader to that of an engineer (the other half of my career), if I could practice and LeetCode my way into a FAANG and make more than I've ever made to be part of the headcount machine, why would I stretch and innovate? I'd just sit back, do what's expected, and collect.

It's a weird dichotomy, and I fully appreciate that I'm biased in my perspective, which may be detached from reality.

And, again, in the imaginary world where I'm an investor, why would I want a large team built of folks just looking to collect a huge paycheck by being part of the machine?


I found an inverse correlation between effort I put in and rewards. The more I coasted, the safer I was politically and the better the salaries.

It’s bizarre.


I can tell you have decade(s) more experience than me in life and the software field in general. So my perspective may be limited and different in some ways.

But really, different people think differently. Investors are looking at it differently; they're willing and eager to gamble. VCs basically gamble on 1 hit out of 100 bets. But that 1 bet will double or triple their money, so it's worth it for them even if they have to write off 100 million in failed bets -- there are probably tax benefits to doing this.

I found out about this at one point. I asked a founder, "so investors are just gonna let the company go under? They're willing to write off 5 million in the funding they gave your company?" It was such an irrational thing to me. But looking back, it's not irrational from the VC's point of view.

I agree with you. Some things make no sense as you point out in your comment.

But people are complex and often irrational, at least from an outside perspective. Imagine aliens observing us: what would they think about how we live and operate?

Reminds me of a Silicon Valley quote -- what a show; it's remarkable how accurately it depicts the industry:

> You're one of Peter's compression plays, huh? Uhh, one of? How many does he have? Not too many. Like six or eight. Ok. Why are there so many? You know how sea turtles have a shit-ton of babies because most of them die on their way down to the water? Peter just wants to make sure that his money makes it to the ocean.


I suppose it depends. If a company can show impressive growth with a small staff, that's the ideal. But at that point the market/investors will probably ask "if you can sustain this kind of growth without many employees, why don't you hire a bunch and grow even faster?" Of course, that's not necessarily how it will play out (mythical man-month, market size, sales capacity, etc.), but the investors will still get itchy.


A growth mindset is great for an individual, but not a business or whole-scale economy.

I worked as the only developer at one company, one of 5 at another, and one of 50 at the last. The small team is ideal: division of labor, but everyone still kind of knows everything about the whole software package.


In industry the chance you work on a project that actually makes money is probably something like 10% per job you have EVER had.

People making the actual money are truly rare. Most employees are doing speculative side projects, infrastructure of the company, field support, etc., but very few are on the critical path to a product that is actually cashflow positive ...

In my career of 30+ years, I can claim I was making money in a profitable division that was cashflow positive for 5.75 years.


I appreciate your objectivity, these are not the most usual things to say.


> I don’t think tech workers are overpaid per se, but most people are underpaid given the effort they put in or conditions they work in.

Sure, fair enough, but you being laid off didn’t uplift someone who was being underpaid, on the contrary it went straight into the pockets of investors who probably aren’t hurting for money.


> There’s been a weird accounting thing where companies put a lot of their compensation into equity, but I think that’s going away as investors are learning to better account for dilution and employees appreciate the fungibility of cash.

Interesting, is there any public data to support this?


> Some of this (tech industry) productivity gains just come from adoption of existing tools like Google Workspace or Office 365, issue trackers and version control with tools like Gitlab, Github, or Jira.

Why would other industries need version control with tools like GitLab, GitHub, or Jira? These tools exist to solve complexity problems that arise in software. They aren't making the tech industry more productive than others. If anything, it's a huge red flag if a company is a little too obsessed with Jira.


I've noticed salaries haven't kept up with inflation over the last four years. Could that be playing a role?

Maybe companies realizing they can’t attract employees with their budgets so they don’t bother hiring?


Does this article actually say anything? It seems like a tech bro founder who laid off 7% of his staff rambling and everyone eats it up for some reason? Don't forget the a16z quote because we aren't done shilling crypto


That said, after working with small teams, I observed it's kind of hard to maintain every team member's enthusiasm and drive without people feeling burnt out.


When there is a one person $1B company someday, which I think is very possible, I believe it will come after a series of events that will make it obvious that such a thing could exist. For example, the existence of several multi trillion dollar market cap companies, plethoras of small companies worth several billion dollars, and probably a radical rethinking of compensation and capitalism in the economy.


I think WhatsApp stands out here, as they had 55 people employed when they were acquired for $19 billion in 2014.

I know it was mostly stock, but given how things played out, that means it was worth even more, as it were (Meta stock only rose from that time period onward, for the most part).

We're already starting to see this become more and more possible with fewer and fewer people.


Probably the guy who masters how to make realtime interactive porn with AI will be the first $1 billion sole proprietor ... Don't laugh; YouTube already has hundreds of Stable Diffusion channels uploading MANY videos a day ...


He'd also probably win some Nobel prize for addressing human overpopulation.

I feel bad for any normal human who needs to compete for affection against a future, hyper-optimized AI sexbot.


One billion as in VC-pre dumb money round valuations? VCs, and Altman, would love that!


FYI, Automattic has lost more headcount in 2023 (proportionally and in absolute terms) than some companies that had public layoffs.

I know of several cases where individuals receiving stellar reviews were suddenly fired for “performance reasons”.

Call it what you want, Automattic has been doing stealth layoffs and it is disingenuous to pretend it has not been.



