There has always been a bottom-tier category of software development: programming at the level of "tweaking WordPress websites". Not that all jobs at this level are literally doing that, but the technical skill involved is what it takes to set up, configure, operate, and tweak a WordPress website, which requires some understanding of how websites work, web servers, PHP, JavaScript, CSS, a plugin ecosystem, backups, etc. It is quite simple for a software pro, but light years beyond what someone untrained could do.
Also, this can be extremely high-impact to a business. Consultancies exist at this technical difficulty, making lots of money but also delivering lots of value. Go into a non-tech business, notice how 1,000 man-hours per month are spent essentially updating Excel files, automate most of it with a Python or VB script, and save the company hundreds of thousands of dollars.
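To make that concrete, here is a toy sketch of the kind of script being described. Everything in it is made up for illustration (the folder layout, the `branch` and `amount` column names), and it assumes the Excel data has been exported to CSV; a real version would likely read .xlsx files directly with a library like openpyxl.

```python
import csv
from pathlib import Path

def consolidate_reports(report_dir, out_path):
    """Merge every per-branch CSV in report_dir into one summary file,
    totaling the 'amount' column per branch -- the kind of copy/paste
    work that otherwise eats hours of manual spreadsheet editing."""
    totals = {}
    for report in sorted(Path(report_dir).glob("*.csv")):
        with open(report, newline="") as f:
            for row in csv.DictReader(f):
                branch = row["branch"]
                totals[branch] = totals.get(branch, 0.0) + float(row["amount"])
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["branch", "total"])
        for branch, total in sorted(totals.items()):
            writer.writerow([branch, f"{total:.2f}"])
    return totals
```

Point it at a folder of monthly exports and the repetitive hours collapse into one command, which is exactly where the "hundreds of thousands of dollars" comes from.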
You don't need to feel attacked or nervous that these jobs exist. "Software [Developer|Engineer|Architect]" has no legal definition and the title is meaningless, which is why hiring developers has overhead where you need to check far beyond what the resume says to assess a candidate's true skill.
It's great that $50-80,000 salary jobs can exist for people who have a modicum of software development talent. I'm very happy for these people.
Is fixing WordPress sites really a bottom-tier programming job? I know people who do that as freelance consultants, and they have a lot of freedom in their lives. Compare that to some enterprise developers who get hired right out of university to build exactly what they are told, when they are told, for long hours, until they turn 36 and get replaced.
I mean, the latter requires a higher education, which is something that doesn't exactly rain down on people below the middle class. It also pays better and the problems are more interesting, but I know which of the two I'd personally prefer.
I think he's talking about the skill level, defined roughly as how much background knowledge you need to perform the job + how much judgment & concentration + how much creative/analytical thought.
This doesn't necessarily correlate with either remuneration or work environment. There are plumbers in the Bay Area who make more than a theoretical physics post-doc and deal with a lot less stress, but most people would agree that getting a Ph.D in theoretical physics requires more skill than becoming a plumber.
I don't think plumber vs. theoretical physics post-doc is a great example.
Plumbing is critical skilled work, taking years to master, and is paid accordingly.
I imagine plumbing is also very stressful, at times, such as when you're being careful not to flood a building, or avoiding breaking something that would require ripping open a floor to fix.
Though plumbing guild training sounds not entirely different than academia. (And postdocs tend to get paid poo because the market, and their guild, permit it.)
This discussion is nonsensical. If a theoretical physicist is so over-educated that he can't figure out how to parlay his skills and knowledge into a $200K+ job in Quant Phynance, insurance, or as a full-fledged Data Scientist, then he/she needs to reskill on street smarts.
It's not all about getting paid the most for some people. I think a challenge for physicists who want to work in that field is that there are more of them than there are jobs.
Compare that to programmers (of all levels), where there is basically endless demand in most big cities; in the top, say, 15 tech centers in the US, probably a few hundred thousand jobs are available in total. I'm sure Seattle alone has 50k.
It is interesting how the trade profession system of apprentices, journeymen, and masters resembles that of the academic system of grad students, postdocs, and PIs.
Skill-wise, absolutely. Demand-wise, absolutely not. I'm a freelance developer now, but before that I was a senior engineer at an international software company. I've considered several times pivoting to productized WordPress consulting because of how easy it is and how scalable it is relative to other kinds of software development. The level of demand for basic WordPress setup/configuration is really pretty astounding.
> It's great that $50-80,000 salary jobs can exist for people who have a modicum of software development talent. I'm very happy for these people.
My impression is that these jobs have always existed but are disappearing. We used to call them support developers, but that job title seems to have vanished lately, and the jobs themselves seem to be disappearing in our rush to make everyone a "full stack" interchangeable part. Management sees the skill overlap and seems to think these roles can be filled by "real developers", which is a terrible term but I can't think of a better one; that may even be part of the problem.
The even bigger loss is that they could be great entry level roles for bringing less experienced people into the industry.
I work 9 to 6 at one of the most prestigious companies in the country, write high-quality C++ and Java for a complex domain on an almost daily basis, and only make $15,000 a year.
$300 a month is astoundingly low. What can they be doing for that amount of money? I'm amazed the cost of living is so low. If they got remote jobs with first-world companies they could make 30 times as much, if they can do Java or C++.
There's an agency here which employs developers for different sorts of client work. Some of them I talked to were doing JavaScript and others .NET. The employer is Swedish.
Other employers are small businesses which have a need for a developer and pay the lowest they can get away with.
Cost of living is really low here. You can get a cozy two-room apartment for under $200/month. There are a lot of "boarding houses" where you can get a room and a shared bathroom for $35/month. A small studio apartment may be as cheap as $100/month. Young adults here will often stay with their family, though, unless they are working in a different city.
Yeah, but $1,000 a month for a remote worker who can do Java would be feasible. Think about how much you could outsource with just one person's salary. Of course, that's the fantasy of moving work away. I don't want that, exactly; I'm fortunate and privileged to be an American programmer. But I need to justify my high salary by being very productive and careful, and by using good techniques and engineering practices.
And here I am debugging TCP stacks, reverse engineering custom network protocols, writing ISRs in assembly, and doing fullstack, embedded, and native development while making that same amount.
Just because a profession is challenging or time-consuming does not mean the pay will be commensurate. Those little Wordpress sites are often responsible for significant amounts of small business revenue and the people who keep them running smoothly are paid accordingly.
If you want to get paid more, find a position where your work has a direct and measurable impact on revenue or profit and your compensation will flourish. And if you're already doing that without the pay, then it's time to find a new employer.
It depends on where you are working and the industry you are working in. But if you are doing all of that and not making over $50k, it sounds like you are underpaid. The two factors that can lower salaries are lack of exposure to CS fundamentals and theory (usually because you don't have a degree), and working in a non-tech town where they don't pay as much.
> which requires some understanding of how websites work, web servers, PHP, JavaScript, CSS, a plugin ecosystem, backups, etc. It is quite simple for a software pro, but light years beyond what someone untrained could do.
Most WordPress sites I’ve seen for small businesses are using some type of managed services where the person isn’t dealing with the server.
I worked teaching factory workers how to use a piece of software, and can say that no matter how much education and training the majority of these people receive, they are not going to become software developers, and it isn't just the older generation. It takes a certain level of abstract thought to be a software developer. The kinds of software development jobs that a typical factory worker is capable of are quickly disappearing, if they aren't already gone.
A friend of mine wanted to give back and started teaching programming to disadvantaged kids.
He was shocked to ultimately discover that most of his students simply didn't seem to be fundamentally capable of the abstract thought required, and it didn't matter how much time he spent with them. He came away from the experience fairly disillusioned; I also never expected the difference in ability to be so extreme.
I personally really wonder -- is this something "genetic", like an athletic ability? Is it something learned, but somehow acquired at a young age? Is it a mindset, that we just haven't figured out how to communicate? Or a question of motivation, if it seems "hard" to think that way?
I recommend to everyone to get outside your tribe and comfort zones and spend real time with people who are genuinely different than you.
For me, the biggest such time was being a conscript in my small European country. Not a great time, but holy shit, did it open my eyes to the existence of people I would never have thought were real!
Emigrating to California was comparatively a mild culture shock. Turns out I had more in common with muslim Pakistani software engineers than farm boys from half an hour outside my home town.
I work as a software professional in the US Midwest. A few times, conversations have come up about what the average American citizen looks like, and I remind my coworkers that it's not us, and that if they want a more accurate cross section of society they should visit the DMV.
> Emigrating to California was comparatively a mild culture shock. Turns out I had more in common with muslim Pakistani software engineers than farm boys from half an hour outside my home town.
Are you Muslim? If I dropped you in the middle of Pakistan, I am pretty sure you would have nothing in common with most people there.
BurningFrog is not Muslim. BurningFrog was pointing out the religious difference between BurningFrog and BurningFrog's coworkers (Muslim Pakistani software engineers).
>If I dropped you in the middle of Pakistan I am pretty sure you would have nothing in common with most people in Pakistan.
That's pretty much the point BurningFrog was making. BurningFrog has a lot of similarities with software engineers willing to move to California even if they're of different religions and nationalities. But BurningFrog was very different from "most people in" "[BurningFrog's] small European country". And BurningFrog is similarly very different from "most people in Pakistan".
To be fair, one reason I got along so well with people from every corner of the planet writing software in Silicon Valley was just that I had learned about strangers in the military.
When I first got interested in tech it was called "being a geek". Now with the broification of tech, and the influx of people from cultures that optimize solely for income and prestige rather than passion, it seems to be forgotten that some people are just geeks and some (most) aren't.
I think plenty of people wanted to be in tech for more money in the past too. And there were those people who were just happy to be able to play with computers and happily surprised to be paid well for it. It's not a new thing.
This is what bothers me about the concept of meritocracy. Sure, hard work is unquestionably a necessary component to success, but humans also require a large amount of capital investment (education, medicine, etc.) to reach their full potential; without that, they are doomed to the pool of unskilled labor. People are generally sorted into career hierarchies before they are even given a choice.
What is the solution? I haven't a clue. Maybe an education system similar to Germany's, though I doubt that would catch on in the US.
But I don't think that's really true in this day and age when it comes to programming. All the information to learn programming is out there. You have direct access to all of the lectures and materials of university courses.
I speculate there are more intelligent people than there are people with strong psychologies (those who are committed to learning), but by filtering for the intersection you're throwing away a lot of industry advantage.
I'd also say that the problem people are bothered by is human suffering. Even if you don't like how meritocracy is conducted today--maybe you feel that 50% of positions in society are not sufficiently merited--and even if you could snap your fingers and magically sort people by potential like an oracle, you are still left with a meritocratic race where few people win and the losers suffer badly.
As a society we should contemplate more deliberately the fate of losers.
I failed out of college multiple times. My life was bad and unstable, and I had panic attacks and depression. I also have difficulty visualizing future rewards. I have one of these "weak" psychologies.
I met my wife, who helped “smooth out” my moods, and went from someone who couldn’t hold down a job at Walmart to a software engineer making many multiples of what I did just five years ago.
Even then, mostly what the stability did was allow me to take Adderall without horrible behavioral side effects like compulsive gambling.
You’re right though, for me I was basically forgotten and nobody cared at that point. It wasn’t until I hit rock bottom sleeping on my moms couch that things got better. I feel like I was pulled up out of a hell I didn’t know I could escape from.
Yeah, but then you still need the time and environment to be able to do it which can be hard if you are working long hours in two jobs or whatever to make rent.
Then you have the old problem that in many companies HR will still want a degree cert.
I think technical and professional education should be always available to people - it helps us all if people can reach their potential after all.
I would say that genetics have a large part to play. A certain proportion of the population just isn't wired to think this way, I don't think it's strictly an intelligence thing either. Some tests have been devised that do predict with some certainty which group a particular person belongs in but these are controversial - and it's understandable why.
The saddest part is dealing with those who weren't cut out to do this yet somehow ended up in the profession. They may well have missed their true calling in life and go on to wreak havoc in organizations where you get to witness their frustration and perpetual confusion on a daily basis.
I don’t know; I met a guy who is not a great dev, but he makes bank. He migrates from job to job but gets compensated far better than he would at anything else. The labor market is constrained by bodies in seats, and while he can hack out some pretty bad code and get himself stuck, over a long enough period he’ll eventually pester enough people with enough obvious questions that he figures out the solution.
For him it seems way better than the alternatives, really. Dev work pays really well.
I will disagree with most of the comments and say that the barrier to entry in programming isn't abstract thought or critical thinking but quirks, inconsistencies, and obscurities. That is really why programming favors having an interest in it: you need, to some extent, to be excited about banging your head against the wall in one way or another. That is why very smart people with PhDs aren't necessarily good programmers, and why, if you don't have some relationship with programming or something related, you might just think it is stupid.
So I do think it is to a large extent a question of motivation and time. That is why so many programmers, even those with qualifications, are self-taught. Of course it helps to be smart, but programming isn't so unforgiving that you can't be average. So that isn't really the barrier.
All fields have quirks but I think software has among the fewest quirks, and at least software is very aware of its own quirks and tries to fix them. Law, medicine, or literature are by far more quirky subjects of study, areas of study which will never be amenable to the lens of something as simplifying as the lambda calculus.
I wouldn't be surprised if abstract thought/problem solving and critical thinking need to be taught quite young to stick. Sort of similar to how much easier it is for young children to learn new languages compared to adults.
I believe so. The crux of the argument was that true immersion learning becomes much more difficult, if not impossible, as an adult, because while children will just speak to other children normally, an adult will "tone it down" when talking to someone struggling with a new language, ultimately putting the learner at a disadvantage, since they might not hear as much natural speech in the new tongue.
Just having the social, familial, and cultural freedom and encouragement to practice abstract thought would be an important first step, otherwise the teaching isn’t very effective.
A friend's been telling me to read a controversial book called "The Bell Curve" about the genetic basis of intelligence.
The reason the book's controversial is that it builds a case that intelligence has a genetic basis, and some races are smarter (on average) than others (although there's tremendous individual variation, and environment / upbringing plays roles too).
If that's indeed the way the world actually is, would that be a truth our society could accept?
On the one hand, I'm no racist, nor do I aspire to ever be one. On the other hand, I want to say that I aspire to a proper scientific worldview, understanding the world according to evidence and reason. If I dismiss The Bell Curve's argument, am I being properly skeptical of unproven claims, or am I just going along with societal pressure and my own wish to live in a world where race really doesn't matter?
I haven’t seen it mentioned here but isn’t childhood nutrition and things like getting enough iodine, avoiding exposure to lead, and not getting abused huge in terms of how intelligent someone might become?
We might not be good at augmenting intelligence but it’s pretty easy to ruin potential, and all of these things are far more likely to happen to a disadvantaged child.
So true. Always takes a bit of a spark to free your mind from the way the educational system makes you do things. Once you get away from the comfort of routine and embrace the danger of thinking in ways that the machine likes to discourage, the sky's the limit.
Not at all true. If you're on the coasts or big cities, things are better. When you navigate away from those loci, things get a lot worse, quick.
Homelessness in my city has grown by 200% in the last 4 years. We regularly have new homeless people show up in Bloomington, IN because we try to care for them and support their getting out of poverty and homelessness... However, this is exacerbated by communities all around shipping us their homeless. Greyhound Therapy is a thing, and it happens more than you think.
The homeless problem is getting worse, by far. Wages have stagnated, rents inexorably go up, and overall cost of living goes up as well. Sure, some company makes more money, but the result is that people move. Sometimes it's to a different apartment, and sometimes it's to Main St. The big reason "we" don't see it is that city, municipal, and state governments don't want to count it. It's embarrassing, and the thought is, 'If we don't count it, they'll go away'.
We in IT have been relatively immune, but I too suspect that will go by the wayside in the coming decades with the factorization and commodification of programming and IT administration.
I hope your friend was able to positively impact a few children's lives. It is a noble cause to do that sort of thing.
I think the programming skill you are referring to is some combination of all of the above. Many very successful people come from poor backgrounds, but going to a good school and having a steady household certainly increases the odds of success.
I would agree. In addition to abstract thinking, being a software dev also means being willing to put up with a lot of tedious tasks most people would find extremely boring. Also, constantly learning new stuff is not for everyone.
Yes, staying in software development, even as a trainer, requires a daily and constant investment of time to advance. This is not something most people can keep up for decades on end. At 61 years old, I still maintain "a daily student schedule" to learn SOMETHING new every day with programming. I must study daily to keep advancing, or at least to maintain "even position" with my peers and not be passed by, because things advance so fast in software and biotechnology.
He teaches them. In my experience, only a few people in an undergrad EE EM class understand divergence and curl. Those that don’t probably won’t do well in certain EE disciplines.
When I was a young kid I came across a mini golf game for the computer (can't remember which platform, could have been Atari 130XE, IBM PC XT, or Apple iigs) that represented sloped surfaces as vector fields. So I knew if there were arrows pointing left, my golf ball going right would slow down and roll back to the left.
That kind of early exposure made it much easier to understand vector fields later on.
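The physics of that golf game is easy to reconstruct. Here is my own minimal 1-D toy model (not the original game's code): treat the slope arrows as an acceleration field, so a ball rolling right across leftward-pointing arrows slows down, stops, and rolls back.

```python
def roll(pos, vel, field, steps=200, dt=0.05):
    """Advance a 1-D ball whose acceleration at each point is read from
    the slope field (arrows pointing left = negative acceleration)."""
    for _ in range(steps):
        vel += field(pos) * dt  # the field acts like gravity along the slope
        pos += vel * dt
    return pos, vel

# A patch of green whose arrows all point left with unit strength.
leftward = lambda x: -1.0

# Ball starts at the tee moving right; the field drags it back to the left.
pos, vel = roll(pos=0.0, vel=1.0, field=leftward)
```

Seeing the arrows on screen and then watching the ball obey them is exactly the intuition that later makes vector fields in a textbook feel familiar.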
"Your job screwing bolts on cars disappeared? Learn to code and be a programmer." Great advice for the typical HN reader, but a lot of people who can learn to screw bolts on cars just don't have what it takes to be a good coder.
I don't know if it's due to innate intelligence, upbringing, environment, attitude, or what. But there are millions of people out there who just don't seem to be able to master the abstract thinking required. And I'm not sure they'll ever be able to, regardless of how much education we give them.
What should our economy do with those people going forward? For social stability and ethical reasons, we ought to give them a path to decent middle class lives. How do we make that happen?
I think the key is that from 1880 to 1980, we invented mainly technologies that pulled more and more people into the labor force. Now we've started inventing technologies that push more and more people out of it.
How do we reverse the trend? What kind of technology can we try to invent that pulls in a huge number of people to work, the more laborers, the better?
> Training, mentoring and counseling people — often from disadvantaged backgrounds — is not a mass-production process.
Yes, yes it is. It literally is. Maybe you need some additional sorting of raw materials in, maybe the machines (the teachers, the mentors) have a relatively low number of cycles in their lifespan (number of students they can teach), maybe there's more sorting on the output (just like chips are sorted after production), but at nation-scale software engineers are definitely mass-producible.
Maybe so, but can you please not post like this to HN? We're trying for thoughtful conversation here, and that means being kind and responding substantively (or not at all), even when another comment is completely wrong.
If you'd read the site guidelines and take the spirit of HN to heart, we'd be grateful. We're trying to be less hostile and a bit more interesting than Internet Default, if you know what I mean.
"Today, Mr. Davis, 27, is a cybersecurity specialist working on an incident response team at the company. He earns above $40,000, more than twice his salary in retail."
Isn't that like what people at Costco make for running a cash register and being friendly?
And who do you think has more job security?
edit: To be less coy about it, if you are any good at this stuff and are only making "above $40k", you are getting f-ed.
Costco jobs are an outlier in retail as I understand it.
$40k/ year in Georgia may be a liveable salary, and if this is an entry level job for him there’s probably the possibility to increase that salary by a 1.5x factor or more over 5 years if he can continue career growth (not sure exactly what a cybersecurity job path looks like, but I assume there is some upward mobility or increased ability to transfer to other higher paying tech roles).
The explosion in cost of living in places like SF has severely distorted peoples’ views of what is a reasonable salary in the vast majority of the country. $40k/year is a good salary almost anywhere in the US. It’s above the median starting salary for college graduates. For a couple, that’s probably $60,000-70,000 in household income. In my county you can get a 3BR house in a good school district for $200-230k, which is affordable for a family making $60k. And I’m not even in a particularly cheap area (an hour from DC, and these houses are maybe a 15 minute drive from the Metro).
The median household income for a married couple in the US is about $80,000, so two people making $40,000. Although that’s poverty level in San Francisco, that’s a problem with San Francisco, not the country. It’s among the highest in the entire developed world, and a very comfortable life in most places. (Particularly so if, like many Americans have done over the last few decades, you move to a sun belt state where your dollar goes a long way.)
"3BR house in a good school district for $200-230k, which is affordable for a family making $60k"
Maybe if you have no children, live in a low-tax area, have an awesome deal on homeowner's insurance and a fantastic health insurance plan, and neither you nor your spouse ever get laid-off or injured.
With 10% down at today's rates, a $230,000 house will cost you $1,200 per month in mortgage, taxes, and home insurance. A little bit less with the mortgage interest deduction and real estate tax deduction. The rule of thumb is 36% of gross income, which is $2,000 on $70,000. The rule of thumb cuts it close, but this is well below that--20% of gross income.
Net pay on $70,000 in Maryland for a married couple is $4,750 per month. (Maryland is not a low tax state.) At that wage level, your job quite likely includes health insurance. (Excluding people who are on Medicaid, Medicare, or military insurance, 84% of people in Maryland have insurance through their employer.) But let’s pretend it doesn’t. For a family of four with two kids making $70,000, an ACA Silver plan in Maryland is $538. About $1,700 for housing and health insurance combined, leaving almost $3,000 a month for saving and other expenses. That’s a very comfortable middle class life.
Say your earnings are split $40k/$30k, a typical split for a married couple. The higher earning person losing their job will reduce the household income to $50,000, raising the housing expense to 29%—still within an acceptable range.
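For anyone who wants to check numbers like these themselves, the standard fixed-rate amortization formula is all that's involved. The 4.5% rate below is my assumption standing in for "today's rates", not a figure from the comment:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Fixed-rate mortgage payment: P*r / (1 - (1+r)^-n),
    with r the monthly rate and n the number of monthly payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# $230,000 house with 10% down leaves a $207,000 loan.
payment = monthly_payment(207_000, 0.045)  # roughly $1,050 of principal and interest
```

Adding a few hundred a month for property taxes and homeowner's insurance on top of the principal and interest is how you land in the $1,200 ballpark.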
Original was "affordable for a family making $60k"
Drop the family back to $60k, and that "$3,000 a month for saving and other expenses" drops to $2,167--and I would put special emphasis on "expenses": a few hundred a month for vehicles, a few hundred for groceries, occasional co-pays (which can get pretty big as you get older and have more things wrong with you), emergency repairs, school clothes, etc.
There will probably be some left over if both parents are employed--not a lot though. I guarantee that the average family will go deep into the red if mom or pop are unemployed for more than a few months in the $200k house + $60k/yr combined scenario.
I said $60-70k, the idea being that the person making $40k is married to someone making $30k. Nonetheless, $60k is 24% of gross income is still well within the recommended affordability range, and well within the 31% lenders will qualify you to borrow. Even if one person is on unemployment (in Maryland it pays half salary for 6 months), that’s 32% of income. Tight, but on average that should last just a few months.
Even if we pop it back up to $70k--still too close if you ask me. Shit happens. Most people would be better off economizing on the house a little, and locking those savings in, rather than micro-managing nickel-and-dime expenses.
What the lenders will do is neither here nor there--given that most will do just about anything for a short-term buck. We all saw how that movie ends back in 2008. I just happened to see it over and over again.
What is so objectionable about telling people to give themselves a little more wiggle-room?
A $230,000 house on a $60-70k income is giving yourself more wiggle room. At $60k, you could qualify for something closer to a $300,000 house. That’s what most people do—yet the default rate for prime borrowers stayed below 5% even in 2008 (and is around 1-2% typically).
No need to guess. In most states, unless you're retired, the property tax is the property tax.
I sold real estate in Alabama (with some of the lowest property taxes in the country) 15 years ago. Even back then, on $230k, you'd be looking at around $1500/mo at today's rates, ~$1500/yr property tax, ~$2000/yr homeowner's insurance.
Unless a person is 100% OK with being a wage-slave, cracking a $21,500 nut every year--when you've only got $60k/yr pre-tax--is not a good situation.
For a $230k loan (which means zero down payment, which is unusual) to be $1,500/mo you're talking about 6.8% interest. Rates are not that high on a 30-year mortgage.
Out of curiosity, what compels people to use terms like "liveable salary"? ("Living wage" used to be more common, but whatever.)
The median salary in the US is less than 32,000/year. Do you think half the country is failing at "living" in anything but a totally fantastical definition of the word "living"? It's a cheap propaganda term--obviously--so why use it? Is it just habit?
Median salary may be less than 32,000 but keep in mind a lot of people get welfare assistance. EITC, food stamps, Medicaid, and many other programs are available for the people making very little.
Personally I started my career on what was probably a similar salary to this person’s 40k (adjusting for inflation.) I had my own experience in mind when I used the term liveable - thinking about being able to afford an apartment on my own and eventually save up to buy a car rather than use public transit/bike exclusively.
Obviously, people do make do with less. Sometimes that means stuff like spending 4 hours a day commuting on multiple buses or something, or juggling multiple jobs and increasing stress levels, etc...
I know some fairly credentialed people in the same field. Their pay is much better, but the job security seems very tenuous--especially for something so joined-at-the-hip with the defense industry.
I'm tired of the continued devaluation of my profession.
Software development (I hate the term "coding") is not a manufacturing job, as much as some companies want it to be. And apprenticeships at body shops aren't going to create software developers; they're just going to create terrible software with obvious rot and decay as the years go on.
I don't understand why your post is so defensive. For years, "tech jobs" were marketed as a way to the upper middle class for a broad swath of people (in America and elsewhere). This was not the doing of "two-bit journalists" (and Steve Lohr has a track record of pretty good work!); it was a concerted effort by tech companies.
It's turning out to not really be the case and the successful stories are comparatively few and far between, while the rest of America falls further and further behind for reasons largely not of their own making.
What should be talked about is what's actually the point of jobs. It's not as a means to get work done so corporations can generate profit. Jobs are a tool to organize society; profit is a side effect, not the primary function. This concept of jobs as a tool for organizing society, rather than as a way to solve problems using human "capital" (humans as machines or tools that can be replaced by other machines which solve the same problems more cheaply), needs additional study, because using humans to solve problems for corporations is becoming more and more irrelevant.
I find this a little bit confusing. The primary function of jobs is that it's hard for workers with specialized skills to extract all the value from their labor alone, so they get together in organizations to do it, and the people doing the organizing (and capital investing etc.) get to skim a healthy chunk off the top (too much in our current economy) for doing so. Is this the same as what you're saying, or different?
> What should be talked about is what's actually the point of jobs.
I fully agree, FWIW. Perhaps it's where my interests have typically lain, but as a developer whose primary job has been to automate other tech people--sysadmins mostly, but developers as well--this has been on my mind for about a decade now.
You mean a tool to organize society like castes were? Or a tool to organize society so that it produces enough medical care and has a large enough army? The first one really doesn't fit with how dynamic jobs are, as they can be created at any income level or sector regardless of the impacts to social class.
I believe you have it reversed. The initial (main) point of a job is so the corporation can generate profit. The secondary, positive side effect, is that it can, sometimes, organize society.
Hate to break it to you, but well trained software engineers create terrible software too. Everywhere I’ve worked had some really boneheaded things going on. A reasonable person showing up with a keen eye for making things better can contribute a lot. We had one software engineer at YouTube who just hunted down and deleted unused code. This reduced complexity and generally made the codebase more maintainable. Was she a rockstar? Probably not, but every day she made the codebase better. For the most part, showing up, being reasonable, and having a handle on the basics goes a long way. I’ll take that over a hotshot working on the wrong problem any day. There’s nothing like the smart man's monkey trap to make otherwise intelligent people cause more harm than good.
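That kind of cleanup can even be approximated mechanically. Here's a rough single-file heuristic in Python (not what that engineer actually used, and it only catches plain-name references, so treat results as candidates for review, not verdicts):

```python
import ast

def find_unused_functions(source):
    """Return names of top-level functions defined in `source` that are
    never referenced by name elsewhere in the module."""
    tree = ast.parse(source)
    defined = {node.name for node in tree.body
               if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))}
    # Collect every plain-name load; attribute access like obj.method()
    # is not caught, which is why this is only a first-pass heuristic.
    used = {node.id for node in ast.walk(tree)
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)}
    return defined - used

sample = """
def used(): return 1
def dead(): return 2
print(used())
"""
print(find_unused_functions(sample))  # {'dead'}
```

Real deletions still need a human to check reflection, dynamic dispatch, and external callers, but a script like this is a cheap way to build a worklist.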
>apprenticeships at body shops isn't going to create software developers, it's just going to create terrible software with obvious rot and decay as the years go own...
Well, if we're intellectually honest as software developers, we also "...create terrible software with obvious rot and decay as the years go on [sp]..."
The devaluation of the profession happened a few decades ago, otherwise most of us wouldn't have been able to join the industry. Especially in the last decade most people in the industry have been cheering on the abstraction of what you do from what you make. At some point someone is going to call the bluff. I think people who join the industry now can do very well. And they probably see it more as a profession than most of the industry with all their social conventions. Then we can go back to the profession being valued for what it actually delivers.
I agree with your hatred of the term "coding". It rings like "typing" or having typist skills. It demeans the many years of time it takes to learn real software engineering skills, which includes proper software requirements analysis, design, testing, packaging, and deployment.
I work with a bunch of data analysts teaching them programming for data analytics and machine learning. If I use the term "software development" the managers think it is out of scope for their job ("that's something IT does, not our group"). But when I use the term "coding" the managers don't seem to mind. They need to grow up and appreciate what it takes to develop analytics software.
I like what you said "apprenticeships at body shops isn't going to create software developers, it's just going to create terrible software with obvious rot and decay as the years go own." These junior apprentices produce stuff that barely works, and it is rotting crap whose "technical debt" never disappears, and eventually requires a senior software engineer to rewrite it after it fails.
Just like manufacturing jobs, some software work could be completed by uneducated felons, and some work could only be done by PhD specialists. In fact I work at such a place that has an extremely wide gamut of both software engineers and production staff.
I would say the statistics would probably support your comment (besides the binary "nobody" and "only"), despite its obvious sarcasm. But that wasn't really my point. I had no intent to criticize certain classes. My point was that software people should get over the exceptionalism.
Sure, the professional developers are paid well. However, the more Jira and agile ideas take over, the more replaceable you become. Why? You're just a card-filling machine dealing with feature requests or problem reports.
And sure, the next developer may fill fewer cards, but they will improve with time.
And if you don't work in a protected area (federal, etc.), then you can buy card-filling devs for 1/5 to 1/8 the cost of a US dev... and have their work QA'd for less than the cost of one US dev.
Face it: I, a sysad, am a service worker. And devs are line workers. We're well paid, but so were factory workers at one time.
I disagree with this completely. Development is not line work, and doing it like line work creates unmaintainable, expensive crap. "The Mythical Man-Month" explains this well. Conceptual coherence is of the utmost importance in effective software systems, and you don't get that by just knocking out stories one-by-one like they're desktop support tickets.
Updating strings in database/code is line work. Updating images on a website is line work. Debugging basic issues is line work. I've gotten these tasks as discrete Jira issues, so someone thought it was important enough.
It's fine if line work + hardcore CS positions exist simultaneously. This fighting over the boundaries of where the profession starts and ends is tiring and we should just accept there's a stratification of developer skill levels and start adjusting our attitudes and training accordingly. Development is all of these things because development is a huge job market.
You can't abstract your way out of doing the line work and only have hardcore CS positions in the job market.
> Development is not line work, and doing it like line work creates unmaintainable, expensive crap.
Huh, it is already happening. Every day I see web services created via auto-generating scaffolding tools like Spring Boot etc. One just fills in environment details and bits of JSON parsing, and the service is ready to be deployed.
Good for you that where you work people are intelligent enough to read Fred Brooks. Most places, including the ones with fancy gleaming buildings, open office floors, and lounges with couches and organic smoothie bars, are just making CRUD services with some cloud-backed data storage. Even the terminology I see at work is inspired by the factory floor, not some intellectual venture.
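The "fill in bits of JSON parsing" claim is barely an exaggeration. A sketch of what the hand-written part of such a service often amounts to, using only the Python stdlib (the endpoint shape and field names here are made up; a real scaffold like Spring Boot generates far more plumbing around the same idea):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

STORE = {}  # stand-in for the "cloud backed data storage"

class Handler(BaseHTTPRequestHandler):
    # The only truly hand-written part of many scaffolded services:
    # a little JSON parsing wrapped around a data store.
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        STORE[body["key"]] = body["value"]
        payload = json.dumps({"stored": body["key"]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo output quiet
        pass

# Spin the service up on an ephemeral port and exercise it once.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

req = urllib.request.Request(
    f"http://127.0.0.1:{port}/",
    data=json.dumps({"key": "a", "value": 1}).encode(),
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # {'stored': 'a'}
server.shutdown()
```

Everything else in a deployed version - routing, serialization config, health checks, packaging - is exactly the part the scaffolding tools generate for you.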
People like us who spend their free time on a message board about programming know it's best not done as line work. But the general population doesn't, and probably many developers don't.
There is constant, never-ending pressure to turn it into line work and we need to be aware of it. A lot of software running (barely) in the real world is written this way.
While I agree with you, it's also true that the majority of software in the world is held together with bubble gum and duct tape. That doesn't mean it should not have been written - it solved some problem for someone.
We should not build complex systems that way, but plenty of systems are not very complex.
To write good code, you need to understand the whole program (or at least a very substantial part of it), why certain decisions were made in the past, the potential ripple effects of a “simple” change, etc.
This means that it’s impossible to just swap devs out like factory line workers, otherwise you’re just creating more bugs.
For similar reasons, you can’t paint half a canvas and then hand it over to another painter and have him finish it...
> Sure, the professional developers are paid well. However the more Jira and agile ideas takes over, you become more replaceable. Why? You're just a card-filling machine dealing with feature requests or problem reports.
Agile does not make developers replaceable. If anything, it empowers developers to be on equal footing with everyone else.
> And sure, the next developer may fill less cards, but they will improve with time.
I've seen projects outsourced to huge teams in India fail to deliver. The reality of software development is that unless you have the right mix of experience, skill, and people, you might never deliver.
Programming requires both engineering and creative skills. Some classes of problems cannot be brute-forced by large numbers of young bodies.
Agile shouldn't make developers more replaceable, but in my experience that's the effect of Scrum when wielded by many kinds of management. Stuff like "story points" rigged such that a problem that would take a week must be reduced to smaller--preferably parallelizable--micro-stories that have implicit context but no explicit dependencies so that the decision of farming it out to many juniors can be upheld.
I'm not saying that this actually works better, to be clear. But I would bet it works cheaper for the time horizons that the folks making those decisions care about.
Splitting the story into smaller tasks is better. It forces you to plan and communicate your progress, and it allows for adjustments in both requirements and estimation. In many business cases this approach is better. The senior developer should outperform multiple juniors anyway; in most cases, it is seniors who carry progress.
Toxic companies will twist a good process and make it horrible. It is not Agile or Scrum to blame.
If you care more about career than team success, project delivery, and skills progression, you are already in a sad place.
"Commodification of the application" is a good thing. It means that your team is flexible and can adapt to changes, and that there are few key-person dependencies. It lets you go on holiday without worrying about whether your team will be able to finish a new service, because you're no longer one of only two people who can navigate AWS :P
There will always be software development work that can’t be done that way because it requires specialized knowledge or cutting edge programming techniques.
Yes, like making handmade clothing, only certain cases require advanced skills like that. I think over time less and less work will require that handmade software; more can be done by putting other packages together. I've been lucky to work on infrastructure that ended up having enough new ideas and advancements over existing systems to justify a new version. But I do feel like this opportunity to do new stuff could shrink. How many compiler, database, or OS kernel writers do we need? The vast majority of people can make great use of an existing compiler or database. Justifying making a new one should be hard.
That's a funny way to look at it - if anything a sysadmin is more of a line worker (I do both dev and sysadmin, so I'm not particularly invested in either), with enough software you could be replaced by BAs coming up with architecture requirements and a system like Docker...
Peculiarly enough, I've had our management ask how to show what we do and how we allocate our time. And as many times as we have ticket after ticket from Jira, nothing adequately shows how we work.
That's why I equated sysads with service workers. Some assistance may take 5 minutes, and other issues may look like a 10-minute fix but turn into a 3-hour timesink. We're serving others, or tending the machines that serve others.
Whereas I see Agile, or the Kanban system, as set up to send work to either internal or external devs. The only reason real factories don't use international work is that sending lines and parts back and forth is onerous on a short cycle... whereas software and Kanban seem made for each other.
Sure there's dev work that requires specialty knowledge, but most dev work is a CRUD web app with a DB backend. And those jobs are especially prone to being outsourced via a thousand cuts.
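For anyone outside the industry wondering how thin that "CRUD over a DB backend" layer really is, here's a minimal sketch with Python's stdlib sqlite3 (table and field names are made up for illustration; a real app adds validation, auth, and an HTTP layer, but the core is this small):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

def create(name):
    # C: insert a row and return its generated id
    cur = conn.execute("INSERT INTO items (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid

def read(item_id):
    # R: fetch one row, or None if it doesn't exist
    row = conn.execute("SELECT name FROM items WHERE id = ?",
                       (item_id,)).fetchone()
    return row[0] if row else None

def update(item_id, name):
    # U: overwrite the row's name
    conn.execute("UPDATE items SET name = ? WHERE id = ?", (name, item_id))
    conn.commit()

def delete(item_id):
    # D: remove the row
    conn.execute("DELETE FROM items WHERE id = ?", (item_id,))
    conn.commit()

i = create("widget")
update(i, "gadget")
print(read(i))   # gadget
delete(i)
print(read(i))   # None
```

Multiply this by a few dozen tables and you have a large fraction of the job market's day-to-day work, which is exactly why it's so easy to parcel out.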
Thanks for that. It's amazing to look at those median job salaries and try to understand how salaries vary so much, along with cost of living. I'm in one of the coastal tech hubs and we pay bachelor's degree people over 130k. An amazing amount, but way over that median.
how can someone else "devalue" your profession? market forces decide the value of your profession. if you feel like your value is being diluted then I hate to break it to you but the truth is that you're not valuable anymore. you are welcome to upskill (in order to be able to deliver more value) or change careers.
like I love people praising the maw of capital (startups rule the world yay!) and simultaneously inveighing against it. you have an issue? unionize (but you'll be excommunicated from the church of capital for trying to).
A profession can be improperly devalued by disregarding or demeaning the values provided by that profession. For example, one could devalue the dentistry profession by convincing a bunch of people that cavities are a myth and sugar is good for you, or the medical profession by convincing people that getting infected with the measles is a good thing.