As I get older, I just don't care about new technology (reddit.com)
174 points by redbell on Aug 26, 2023 | 176 comments



Title is misleading. Not getting excited about a new Next.js version doesn't mean you don't care about new technology. Maybe it just means that you don't feel like reorganizing your code or something. Relatively speaking it's still an incremental upgrade to your approach overall.

I am 45 and stopped caring much about new web frameworks many years ago. It's not that I don't find them interesting or even advantageous in some ways, I just don't feel compelled to bother learning in detail about most of them. Although I usually do look into them enough to get the gist.

You just can't pick up every single new thing. If you try then you won't ever get any real work done.

But some things you really have to take notice of. That's subjective, and there's usually no one right answer. For me, image and text generation was something I felt compelled to jump into.

A big part of this calculation is also social. If you are looking to pick up contracts or jobs, your choice of technology is important. And often picking up the latest trend will make that easier.

It really depends on the technology and what it offers you. I would say that getting familiar with the OpenAI API provides much more bang for your buck (depending on what you are trying to do) versus upgrading Next.js versions. That's much easier to get excited about.


I'm just exhausted by it. I've been writing front-end code since 2009 so I've been here from the early jQuery days, and I don't have the energy to keep rummaging through docs to figure out how this new special thing (next13/astro/alpine/blah/blah/blah) deviates from standard web technology.

I'd much rather conserve my limited energy for things that are inherent to the platform and actually exciting, like WASM or WebGPU.


Maybe off-topic, but jQuery still kinda rocks for small projects. It's not the fastest, nor does it do fancy compile steps to avoid the DOM, but it just provides an easy-to-use API for the most common tasks. And as a bonus, the documentation is awesome compared to most other new-ish JS frameworks.


There’s really no reason to use jQuery at this point. It made sense when the native APIs were lacking and inconsistent between browsers. That’s not the case anymore. Just use vanilla JavaScript.
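To illustrate (just a rough sketch; the '.item' selector and '/api/items' URL are made up):

    // jQuery style:
    //   $('.item').addClass('active');
    //   $.get('/api/items', function (data) { /* ... */ });

    // Roughly the same thing with the standard DOM and fetch APIs:
    document.querySelectorAll('.item').forEach(function (el) {
      el.classList.add('active');
    });

    fetch('/api/items')
      .then(function (res) { return res.json(); })
      .then(function (data) { console.log(data); });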


The big reason to use it now is that there are useful libraries built on top of it, and you already know it.

The value proposition has changed, though, and there isn't a compelling reason to start using jQuery now. But if you're somebody who just stuck with it this entire time, it still works and you're still as productive as you ever were.

And as a bonus you didn't waste any time chasing failed fads along the way...


Once you've seen 3 frameworks, you've seen them all. The novelty wears off, rather quickly. That's the difference between your first encounter and the 10th.


Hell, I stopped caring about new web frameworks when I was 25.


Genuine question, as someone who got exhausted and is out of the industry for the foreseeable future. How do you stay competitive or job hop when every post is looking for the latest web framework?


Don't choose hamster-wheel career tracks in the first place. Try to find Lindy paths [1], [2]. For example, SQL experts and DBAs. The rise and (mostly) demise of NoSQL actually cemented SQL's reputation as an irreplaceable technology. The latest Web framework is by definition non-Lindy: it's the new kid in town who thinks they know it all. grep is Lindy, Unix is Lindy, the qwerty keyboard is Lindy, C++ is Lindy, Computer Science is Lindy, algorithms and data structures are Lindy. Knowledge of the real world, whether it be finance or shipping, is Lindy. Sorry for the verbose reply, but I hope it helps.

1. https://luca-dellanna.com/lindy/

2. https://en.wikipedia.org/wiki/Lindy_effect


Interestingly, I've been thinking it would be a good idea to brush up on my C skills, because while there are serious contenders for its throne, a gig based on Rust today often means being at the cutting edge of tech, and will probably come with a lot of buzzwords like GraphQL, React, TypeScript, monorepo, AWS Lambda, daily standups — things that give me symptoms of burnout just from writing them down.

Whereas a C gig today probably means embedded, or a subsystem far away from the frontend churn. If I'm lucky, they have unit tests. The Lindy path you were talking about.


The demise of NoSQL is greatly exaggerated. It's past its hype cycle, and that's fine.

Many of us use it quietly, to do good work, and there is no need to crow to the world about what is happening under the hood.


My team was asked how we should store data that is variable in structure and depth. When I suggested using Mongo and not trying to stuff it into a SQL db, I was looked at like I had suggested blending my firstborn for margaritas. People seem to think that NoSQL is dead and was always a bad idea.


The NoSQL movement was absolutely acrid towards SQL. The blowback is justified, IMHO. "blending my firstborn for margaritas" is very colorful language, woah :-)


Plenty of IT jobs out there that have job security to last your entire career and don't shift to the new shiny every few years. Utilities are a good example. Lots of older tech and some newer stuff, but nothing experimental.

Ex: Maintain some SQL queries, upgrade to a new database version without bricking everything, write some scripts that get fired off at a scheduled time, knowledge of the industry you're serving, update some configuration files for the company website which won't change for another decade, read some log files, create some visualization displays for operations staff, install software on certain machines, manage the company's VPN stuff, configure Linux, do the paperwork...etc

All this stuff is more generic IT and less rockstar developer. It reliably pays the bills and isn't very flashy.

There are plenty of people in IT that write code every day, some that rarely write code, and some that basically never write code, but just manage the business side of it all (there is a lot of bureaucracy in large organizations).


> How do you stay competitive or job hop when every post is looking for the latest web framework?

Pretty sure you can get a job in frontend dev if you know React, which is 10 years old.


Map territory distinction -- what they are looking for (strong engineers regardless of stack) and what the role is (a description of what you'll be doing) are two different things. Find your way to doing and articulating impactful projects that use technology to measurably create leverage and value for the business, and you will be fine. Of course, that can be easier said than done.


Stop looking for work in tech places. It may require you to take a pay cut and work in a place viewed as a cost center though.


I'm honestly thinking about just picking up C++ or something and turning my attention to embedded systems. I'm sure there's a wealth of old tech sitting around that needs to be maintained.


I'm in embedded; there's also cutting-edge stuff to work on. The nice thing is you can be mostly detached from annoying web APIs and frameworks and databases. The not-so-nice stuff is dealing with crappy vendors, using crappy toolchains, and spending a ton of time setting up automated builds. Sometimes I think about switching to true embedded (I do embedded Linux), but that tends to pay less.


I have never cared about them


And I always despised them!


same, complexity for complexity's sake.


To be fair, some of those frameworks were compensating for JavaScript which had real issues as a quickly-becoming-popular language.

SPAs were an answer to cross platform and device issues, given the web browser was the best cross platform piece of software used by the largest population. It made sense to try that vs making native apps on all the different platforms.

The fundamental problem in the web space spawning all of these tools has always been JavaScript’s weaknesses though.


I was there, friend. I didn't agree, but hey, I left and went to "apps" for 10 years...


I don't care about new tech per se, I care about solving problems for me and my customers. When new versions of libraries/frameworks/whatever do that, that's exciting.


Yeah, either you pick the tool that is most suited to the job or the tool that you are most productive in.

I agree with your sentiment about learning things that are important. I made a natural language network scanner and got to speak about it at DEF CON. I never intended for people to find it interesting; I just wanted to learn to apply LLMs. The source code is at https://github.com/zitterbewegung/saturday and you can interact with it at +1 (825) 251-9142


How do you know the tool which is most suited to the job if you never spend time learning new tools? And the tool that you are most productive in suffers from the Innovator's Dilemma - a new tool might make you more productive in the long term but will be a drop in your productivity in the short term; if you switch you'll lose what makes you competitive today, if you don't switch you'll lose out to people who are more competitive than you.


Does anyone really care about Next 12 -> Next 13?

On the other hand the rise of AI is an interesting thing. And Mars rockets and ebikes are cool.


There isn't a whole lot of new technology around. Apart from LLMs, everything else is a remix/retread/clone/speed bump with a few tweaks.

iOS is 16 years old. Android is a year younger. AWS is a year older. The original Oculus Rift is from 2016.

Facebook dates from 2004. Twitter from 2006. Amazon has been around for 30 years. (And looks it.)

Starlink is a consumerised reinvention of Iridium (1997).

Docker - 2013. Git - 2005. Github - 2008. React - 2013.

And so on.

Also a non-trivial wave of 80s nostalgia, with endless reinventions/emulators of 80s products or 80s ideals (Amiga, Spectrum, Atari ST, even RPi in its own way.)

Genuinely new? AI is pretty much it. (And maybe electric cars and scooters.)

Quantum computing and robotics are going to make a difference soon-ish, and likely lead to some innovations in User Space. But they're not consumer technologies yet.

The big change for me is weariness and enshittification. I used to be as enthusiastic as anyone, but what we have today seems like a nightmarish digital dystopia dedicated to mass surveillance, behaviour modification, and relentless predatory pursuit of profit.

This covers the range from the big adtech leviathans, to the mid-tech AirBnBs etc and their catastrophic effects on local economies, to the thousands of crappy addictive phone games that people waste their lives on.

There are some good things, but far less than I hoped.


You forgot electric mountain bikes, eMTBs.

Maybe not a big thing for everyone, but for the nature-loving nerds at the intersection of computers and mechanics this has personally been the most exciting hobby for the last 5 years.

The whole MTB industry is exploding with innovation; just recently Pinion released a motor with an integrated gearbox, getting rid of the 100-year-old derailleur.

https://youtu.be/YXmMV1LQu-s?si=xfHIfBTPeYfGar5X

https://youtu.be/H3KsWIz4LDo?si=j6gMUylkr12GetPN

Better battery tech and sleeker integration/improved weight is creating some truly amazing new mountain bikes and the experience of riding one is quite novel compared to anything in the past.

For the uninitiated, it's not like riding a motorbike. These bikes are pedelecs meaning you only get extra assistance when pedaling (there is no throttle).

The best analogy is the feeling of being a bionic man with extreme strength and stamina. The first time riding up a steep hill brought the same sensation as riding a bike for the first time as a kid.

Pure bliss, I couldn't stop smiling.


I'm mostly a software person and electric bikes are largely illegal where I live, so I might be biased, but aren't electric bikes just a kind of "incremental improvement"? I mean, I totally agree that they make the biking experience much better, and those who bike might find the experience revolutionary, but the "technology"... seems incremental rather than new.


The endless pursuit of profit at any cost, normally by exploiting the ever-living hell out of everyone, is causing us all to have a shared existential crisis. Well, some of us are, anyway; there are plenty of folks who think it's just great, as it "aligns with our base nature" or some such nonsense. I hope humanity can figure out a better system after we hit rock bottom "spiritually." Meanwhile, our media, owned by the exploiters, will keep pointing our attention at things other than what we should be angry at.


What better system would that be?


Keep chasing profit. That's completely fine.

What's really concerning though are how so many players want to do so at all costs i.e. suck the air out of the building except for their own room.

THAT is not fine.

Play the game. Just stop tilting the table.


mRNA, CRISPR-Cas9, immune checkpoint inhibitors, etc.

Those are novel, and CRISPR-Cas9 in particular was so universally recognized as novel that it earned a Nobel Prize with a much shorter delay than usual.


Yes! We switched from electronic technology to bio technology.


Where are the biotech equivalents of FAANG?


If you're talking salary banding, no companies in biopharma pay as broadly high, but the big ones like Pfizer, GSK, Merck, Sanofi, etc. certainly compensate high.


Good question.


Honestly, this is how I've been feeling lately. Everything is old now.

It's been a long time since a huge revolution tech wise happened. It's just incremental evolution now, and a lot of it is NOT for the better.

The Playstation 2 was as far away from the Atari 2600 as we are from the Playstation 2 today. One of these periods had a lot more change.


> The Playstation 2 was as far away from the Atari 2600 as we are from the Playstation 2 today. One of these periods had a lot more change.

Yeah, from PS2 to now is a much bigger difference.

Of course you meant in relative terms, but in absolute terms the computation (and especially computation per watt) that you can do today is astonishing relative to 2000. Software development on the 2600 had more in common with the PS2 than the PS2 does with today. (Writing assembler in Excel for the vector units was a thing on the PS2.)

What has happened is we have completely lost sight of what all this is for, and ever more of the created resources get allocated to enabling fast pivots towards anything that looks like it might be worth doing.


> Yeah, from PS2 to now is a much bigger difference.

Maybe in terms of development, as you suggest, but definitely not in terms of player experience. For example, I'm able to play and enjoy games from the PS2 (and even the PS1), but games from the 2600 are essentially unplayable. I have a lot of nostalgia for computers and games from the 80s but I realize now that it's for the late 80s, around the time of the SNES.

Sometimes I think that 1990-1995 was peak computing, in terms of "the computer as a feeling". 25MHz single core, 1MB RAM, 100MB HDD, 9600 bps modem, 320x200 256 colors--you could get a lot done and yet every byte and cycle were precious and it was all understandable by a single person. If consumer hardware had stalled out there, we'd still be flying high, albeit with a text-only internet. Software would be more stable and there would be incentive to optimize and improve instead of amass and churn. In fact the only things I'd change in that list to make it simpler would be SSD instead of floppy/HDD and packet-switched internet instead of modem.

Don't get me wrong, as a consumer I would want to stall out a bit later, closer to 1GHz, 1GB RAM, 16GB SSD, 1Mbps always-on network, 1024x768 true color; but by that point the hardware is complicated enough that the software becomes unwieldy, too much for one person to grok fully, and the craft is lost to waste and bloat. For example, with 320x200x8, there's a certain magic of having byte-addressable video memory laid out in a way that directly corresponds to the screen pixels. You can still get evocative images but you have to care about each pixel. With 1024x768x24 you can waste 90% of the resources and be 'good enough'.
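To make the 320x200x8 point concrete (a rough sketch of the idea only, with a plain byte array standing in for the real VGA memory at 0xA0000):

    // Mode 13h style: 320x200, one byte per pixel (a palette index), rows laid out back to back.
    var WIDTH = 320, HEIGHT = 200;
    var framebuffer = new Uint8Array(WIDTH * HEIGHT);

    // Putting a pixel on screen is literally one array write at offset y * 320 + x.
    function putPixel(x, y, colorIndex) {
      framebuffer[y * WIDTH + x] = colorIndex;
    }

    putPixel(160, 100, 15); // middle of the screen, palette entry 15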


From a user perspective, I'm going to have to disagree. You have it backward.

Atari 2600 you were playing something like Pole Position. Gran Turismo 4 was released on PS2. Gran Turismo 7 for PS5.

What improved from Gran Turismo 4 to 7? Model detail? Better grass and reflections? Some improved physics simulation? It looks great, don't get me wrong, but it isn't a massive improvement in user experience for 23 years of development.

VR racing could be a riot. Maybe VR is that leap forward, but VR is another tech we've been trying to get going since the 80s.


Gran Turismo 7 already plays in VR.

Also, disqualifying VR that way by stating we've been trying to get it going since the 80s is a bit silly. Most of the other "huge steps" we've made, like the iPhone, were also just incremental improvements on their predecessors.


Go review the iPhone announcement. The original iPhone was nothing compared to the recent models. Amazon in 1997 might allow you to buy some books, whereas the amount of stuff you can buy is much larger today.

The only point I get from this is that interesting technology takes a long time to get to a mature point.


This might be true for pure SW and computer tech, but what about biological sciences? Or consumer markets?


In terms of biotech and medicine, we're still developing new stuff continuously. Computers being able to do machine learning are helping us push ahead where human manpower was once wasted on busywork.

You can live a full life with AIDS now, with little to no side effects, even with little to no infection risk. We have vaccines against various causes of cancer, and vaccines for specific types of cancer are actually being tested.

We're in the human trial phase of rewriting a living person's DNA to solve genetic defects. So far they haven't been particularly successful, but the concept was mere theory just 20 years ago.

We were hit with a pandemic and had working vaccines in less than a year. Even with vaccine deniers, we've managed to overcome the deadly wave of the pandemic in record time.

Things that were mere theories two decades ago can now be done inexpensively in a small lab. We don't get to tinker with it, but medicine and its adjacent fields are rapidly developing.

Computer science has become rather boring, although languages that bring novel ideas to the mainstream like Rust are still somewhat exciting I suppose.


LLMs too are a remix with a few tweaks. Most of the major things you listed fit that bill too. React, just a better Angular. Facebook, just a better Myspace. iOS, just a better PalmOS.

This is the problem: you are expecting something brand new and revolutionary while looking at the past through rose-coloured lenses. None of that stuff was revolutionary! It only seemed so because you were at a stage of your life where everything of this sort was new to you.

Couple this with the fact that genuinely new technology is always berated at first, and you've got a wonderful recipe for aging pessimism.


I think it's interesting that the time horizon being discussed here is so short.

I don't think I've cared about new technology... ever. I have cared about "new to me" technology at several points in my life.

When I was a teenager, the world was FULL of new (to me) technology, and I ate it all up. Programming in QBasic and Turbo Pascal, using hex editors to figure out file formats in my video games and mod them, building PCs from parts. Along the way to my late 30s, a few more new to me things produced excitement: GIS, Bash, Vim, Raspberry Pis, 3D printing, Linux containers, etc. I was years late to all of those parties, but still happy when I got there.

No amount of adopting the new version of an existing thing will ever replicate the joy and wonder of encountering something entirely new to you, and as you stay in the industry longer, there just aren't as many completely-new-to-you things to come across. Now, I have more excitement learning new things that have absolutely no connection to tech, because it's still novel enough to be exciting. Small engine repair, biology, and carpentry have sparked more wonder in my more recent years than anything I did on a computer.


What makes new technology exciting, whether it's a web framework, a phone, a car, or the internet, is all the things people can do with it that were impossible, or just very expensive, before.

In many parts of tech, we've had quite the run where there were leaps and bounds from one year to the next. Innovation curves slow down though: New phones used to really be substantially better than the year before, but now the new capabilities are mostly irrelevant for most people. Even something like OS updates used to bring in relevant changes, but not so much nowadays. Web frameworks, as the OP looked at, have not been doing anything remotely revolutionary in a long time. So anyone that is focused on just one corner of tech, and only looks there, is bound to start not caring about new tech. It really is not getting better.

To remain enthusiastic, one needs to keep looking in different places, and see improvements in new areas, or jumps in places that were stagnant. AI was relatively boring in the 90s, not so much today. Even something like Java had a decade of relative inactivity, and now there are actual, relevant changes.

So yes, it makes perfect sense to pay a lot less attention to web frameworks right now. But if what you want is relevant novelty that is likely to change how things are done, the solution is to look elsewhere, instead of getting depressed about how modern UI tech is stagnant.


> Even something like Java had a decade of relative inactivity, and now there are actual, relevant changes.

The key phrase for new tech is _relevant changes_. Much of the "innovation" in software is either NIH syndrome or ideological nuances that address 5% of the problem space. Or simply forgetting, or not researching, that people fought this problem ages ago. I know this is HN, but most ICT engineers I meet don't care a lot about the history of how we got here in the first place. Just about their puzzle here and now, and the latest and greatest toolkit.


Feel this, as someone running a modest solo side business.

FPGAjobs.com runs on a single Django app, with some Bootstrap-flavored templates on the frontend. Very little JS, ideally none at all.

It was a learning curve to implement Django at first (I’m not a SWE by training), but now that I know it, it’s fast and easy to add new features. You don’t get that kind of iterative speed when you’re constantly in the early stages of a technology learning curve. I’ll never give up Django. I’d rather sell the business to someone who knows how to scale it than switch tech stacks. That’s just not what I want to do.

It’s easier to get that dopamine hit of “ship thing, watch it get used” when you know your tools well enough to be able to ship and iterate quickly.


Just wanted to say I like your website and have used it in the past :)


Thanks very much. Please don’t hesitate to shoot us an email at fpga.RTL.jobs@gmail.com if you have any suggestions for improvement, or success stories from your job hunt, or anything we can do to help you find a new gig.


> I want to build things for people. Not constantly update my tools because other developers are bored.

I find myself getting frustrated really quickly. If I run a node command and it gives an error, I have the urge to throw my laptop out of the window or smash it.

Web dev used to be about immediacy in the late 90s to mid 00s. Now it feels like untangling wires for hours every day. Frustrating and tedious.


I switched to primarily JVM (Kotlin mostly, some Java) a few years ago and this feeling has for the most part gone away. Having actually stable, actually good tooling/compiler/build systems makes a world of difference to mental health.


> now it feels like untangling wires for hours every day.

At least we've moved past webpack, that was a bundle of wires that couldn't ever be completely untangled.


Have we finally?

That reemergence of DLL hell on the web, which every bright dev insisted was The Modern Way to code (or quit to somewhere else that enabled it), accompanied by spending my days pointlessly upgrading a mix of abandoned and pointlessly breaking packages and then submitting forks/patches so they could remain in sync with each other, was about the point that I gave up coding, because I felt I’d stopped making an impact, and moved to management.

We seemed to suddenly move from fast REPLs, where we could iterate with the customer and solve problems while knowing most of the stack well enough to code, deploy, and support it, back to the pre-my-days compiling sword fights of old, together with days spent getting undocumented boilerplate up and running with fancy pipelines that no one understood, let alone dared touch. I was never entirely sure what we gained, but it certainly seemed to make coding both less fun and less productive, yet everyone defended the new normal.

Then, sitting as a manager after that dev career, I watched relatively basic form journeys take teams of five nine months to implement on the latest stacks, when that used to be something a single junior dev knocked out in a week or two, and watched the teams be shocked when, in a crisis, you could reason out the stack with them in a way that doesn’t seem to really be taught anymore (academically or professionally).

I’m not even 35 and I’m still a tad shy of 20 years of professional experience, but I already feel like a greybeard full of unbelievable “back in my day” stories when I look at the systems I considered and built resiliently a decade ago, still ticking along and still easily upgraded as the environment and systems around them change. Meanwhile I watch new projects that would have been a small piece of work, maybe even just a ticket for a couple of people, now take cross-functional teams of 50 up to 18 months to move through the org machine, only to immediately have their replacements lined up as the next initiative, because no one can face supporting/upgrading the “legacy” solution they just designed, built, and deployed. Don’t get me started on the way “those that can’t do” have redefined process around their own need for relevance and control.

With that said, I’ll concede to having burned out a little on that journey and currently being on a bit of a sabbatical where I’m toying with fun business ideas and other passions to reenergise and reboot creatively, so perhaps my cynicism is an anecdote rather than a snapshot.


> I’m not even 35 and I’m still a tad shy of 20 years professional experience but I already feel like a greybeard

35 here, started coding when I was 9. Even got into web pre-jquery.

Yes things take longer now. Yeah we need different ways of working too. But you know what's cool? Not being pulled into a super critical meeting with client executives at 6am because the site is down and you, the 23 year old kid, are the only person who knows how it works.

Team-based work made coding feel like work. And it brought new annoying toolchains and other shenanigans. BUT it also unlocked vacations and the ability to do a thing and move on to a different thing. I rather like that.

PS: all my references and cool jokes are out of date and my team mostly gives puzzled looks. The "back in my day" stories are completely unrelatable :(


What came after Webpack?


Enlightenment


Vite is pretty standard. Most importantly, it does a lot less by utilising new standards and a reasonable default config.
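To give a sense of scale (a minimal sketch; the React plugin line is only an example and assumes a React project):

    // vite.config.js -- the defaults already cover the dev server, ES modules, and production bundling
    import { defineConfig } from 'vite';
    import react from '@vitejs/plugin-react';

    export default defineConfig({
      plugins: [react()],
    });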


I think Rollup was the first sane webpack replacement: significantly less versatile, while significantly more sane.


I just moved to webpack like a year or so ago. It's so great! What is better? I don't know.


Unfortunately it's not only web development; try setting up some microcontroller toolchains. It's all just really tedious. I'm not a professional, so maybe the problem is me, but a lot of this tech stuff is just frustrating to work with. Slow, unresponsive UI (VS Code + PlatformIO), slow startup times (STM32Cube or the Arduino IDE); it's just frustrating sometimes. Whenever I can, I try to work with just Vim and install/upload stuff via serial, which is its own kind of hellish. :)


I have an Arduino-based clock kit where I added a firmware customization to support a GPS chip to auto-set the time.

If this daylight-savings change goes through in the US I'm going to have to build a new version and I haven't touched it in like six years, I'm sure that's going to go swimmingly.


Possibly related:

“I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.”

— Douglas Adams


For me the ages are more like <5 is normal, 5-20 is exciting and new, 20-35 is suspicious but I might still try it begrudgingly, and 36+ is nah, I’m too old for this. I’m in the last group now :)

Although I also remember being 18 and meeting some 70+ year olds that were very adept at using computers/the web and that really impressed me.


I don't know, I'm forty and LLMs are exciting as hell. If we're talking web frameworks, Django was the last good thing and there can never be any more innovation in that sector.

In seriousness, though, a technology that lets me do things I couldn't do before is great. A technology that slightly reorganizes my code and makes it so I have to throw everything I have so far away, I'm less excited about.

What motivates me is building things, which unfortunately means that I'll always reach for the tools I know, and only learn new tools if my current ones don't work, which is why I've been unable to learn Rust.

With Django, I can get stuff done really quickly, and people don't care that it's not a SPA, they care that they can solve their problem with it.


If you are 40 then you'd have been 33-34 when the "Attention Is All You Need" preprint that laid the foundation for LLMs became available.


Indeed I was.


After coding for more than 15 years, I can assure everyone that technology is always an enabler, and never the main thing (even if it pretends to be). The main thing is human interaction. We are humans after all.

Since "value" can either be a societal construct or a biological one (e.g. needs for food), it has now occured to me that it makes more sense to view technologies themselves as without intrinsic "value". Technology can only be a mean to an end, and not the other way round. (Or who is going to fund it? The military? Then it has to be military tech.. which is used to enable the strong stablization of a status quo, a currency, or the American dream... so once again, tech would always be a mean across any social/cultural/political/psychological vehicles.


It's down to cost-benefit. If life has taught me anything, it is to never be an early adopter unless you absolutely have to. It applies to many things, but especially in tech: never buy the first version of a device (in fact, you'd be spending more money for a worse experience), never use the .0 major version of something important enough, never use the beta version of something unless you must (e.g. you're a dev and you're preparing your apps for the next version), never learn a language or framework that just came out because there's a 100% guarantee things will change a lot and it may as well die within a year, etc. However, I do learn new stuff all the time, but they're things that have been somewhat established and solve problems I have.

There's also the thing that one may be depressed for one reason or another and then feels like doing nothing new anymore and just wants to play with the known -- which is a quite natural reaction, but shouldn't be confused with getting old.


The truth is, producing new stuff follows the law of diminishing returns in every category. After the point where the diminishing returns are really noticeable, new inventions and research actually become harmful and pathological. That includes many academic research fields and most technology today.

In fact, if you add up all the breakthroughs we've accomplished, most of it is just helping us use resources more efficiently so that we use more of them, eating them all up like a virus.

We should have stopped most research and technological invention long ago, and instead focused on sustainability and trying to reach an equilibrium with our natural environment.


You've put into words perfectly my thoughts on it all too. Sustainability, self awareness, equilibrium - these are foundations of strength and resiliency over the long term.

Technology went from empowerment to entrapment.


> harmful and pathological

Don't forget "toxic" too !


Interesting that the Reddit post is about web frameworks and stacks, as the title made me assume this was purely about devices/hardware.

Maybe I was projecting my own feelings onto the topic, which are mostly about hardware. As I’ve gotten older, I find it much harder to care about new PC components, cutting edge graphics for video games, and processor cores. I just value stability and consistency above all else


At least in part, I don't have the time or energy to care anymore. In my 20s I loved reading long form articles on AnandTech about the latest chip or motherboard. Nowadays when the kids are in bed at 9:30 or later and I finally have a moment of downtime, I just want to watch a tv show with my wife and go to sleep.

I still think new hardware is neat but I don't have the mental bandwidth to follow it closely and in depth at this stage.


I'm honestly not sure it's as much you getting older as it is games and hardware getting older, and with that, change becoming more incremental.

When you got a new graphics card and game generation in 2005 the leaps were pretty wild, nowadays you can probably skip a generation or two of top hardware and you can still run things fine. After they put out the Dead Space remake I was interested to see how the old game from 2008 played, and even that still plays and looks modern. That's 15 years now, go back 15 years from 2008 and it's very different.


That’s very true as well, tech is more iterative now. I also think just having less time due to being a parent now also wipes out a lot of interest or motivation to check out new stuff


I had the same feeling reading the title, I guess you are right about projecting. Just this morning I realized my old ThinkPad is about to throw in the towel; thinking about going to find a decent replacement in the current market depresses me.


I am still excited about hardware but progress has slowed down to a crawl, especially on the GPU side.

Can't blame AMD and nvidia for trying to make a profit, though it sucks to be a consumer because what we do get right now is severely overpriced junk that couldn't be sold to the big enterprise bois. Literal scraps.


For another perspective, although I did buy an Apple Silicon MacBook, I do a lot of my computing on almost 10 year old Macs and mostly care in a theoretical sense about new CPUs and GPUs. Most gear that I have is plenty fast for anything I do.


I used to work 5 days a week, then I was very much in a “I just want my shit to work and do my job” mentality. I had no energy to learn and experiment outside of work. Now that I work 4 days a week, I have found my passion again. I’ve gone back to emacs, started a blog, learning rust, started self hosting and I am having tons of fun in doing it. I haven’t been this passionate since high school.


As I get older, I realize that learning new stuff all the time is a mug's game. 90% of shiny new things will be irrelevant in a few years.

So I'm fairly cautious about what I bother to learn. I'll grudgingly learn new stuff if my work requires it, but mostly I coast on my decision to learn python 15 years ago.


Right but even staying within Python there is plenty to learn and lots of evolution and new stuff coming up over time.

I think that you just realize eventually that you have to be more selective with your attention. And it's often not hard to use up plenty of time keeping up with whatever domain you are already in.


> Right but even staying within Python there is plenty to learn and lots of evolution and new stuff coming up over time.

By the time that’s a problem, there’s a good chance OP will be managing people or in a different field.


> python

Typing? Get off my lawn.


Having grown up in the 80s/90s, I'm also starting to get older.

While I don't care for technology like "do X with an app" where I know X can also be done without an app, I do get excited for actual real new tech with substance: LLMs, self-driving cars once they truly work, much higher video resolutions than in the past, etc...


As I have gotten older I stopped developing for the web entirely. There's a certain simplicity the UNIX philosophy gave us that is sorely missed. I've started to take the Suckless philosophy more seriously when more than a decade ago I used to mock them for being curmudgeons. There's no glamor in sucking less. No youtube videos or made-for-conference code. But, there's a beauty about it that is hard to describe.


What is Suckless philosophy? Gravitating around simplicity more or less?


http://suckless.org/philosophy/

More or less. It's the "Worse is Better" philosophy. Instead of adding features to make the user experience better, make software that is good at one thing. It emphasizes minimalism and frugality. You may know the Suckless group from `dwm`.


This is the best way to look at it. I feel similarly, and I’ve been thinking for a while now that the one technology that is actually worth embracing or paying for that I have used is adaptive cruise control. It’s close enough to self driving for me that I consider it self driving.


One thing that I hear a lot in the tech field is how exciting it is to work in tech since "I'm always learning new things." This doesn't excite me very much since many of the new things I'm learning are not useful outside of a narrow slice of time within that specific tech job. They're not broadly applicable to anything, and are wholly specific to a software product as a service which will be fairly different, or may not even exist in a few years. (even if it does exist, my company will have moved on to some different product by then.) For example, learning regex could be useful because it might be used broadly in many different products. Learning how a specific product handles certain data fields is really not exciting at all. It's wasted space in my mind, except of course for the paycheck.


Nailed my problem with "learning new things". I especially feel this way about "cloud" - you aren't learning some new technology, you're learning some company's (cough... Amazon... cough...) proprietary offering which is designed to lock you in and wring every last penny out of you. Not that it isn't useful, obviously it is, but I just don't get the same joy out of learning about it that I do from playing with SBCs or other tech like I used to.

I suppose that is why something like Kubernetes is useful (I haven't touched it; I'm one of those weirdos who uses Swarm); it allows you to abstract away a lot of the underlying provider and create your own overlay environment which does bring back a sense of ownership for me at least.


Content didn’t really follow from the title. It seems like OP wants to minimize unnecessary effort when working towards their goals. I notice it in myself as someone who is mid-career. I manage the workstation software environment for a group of users, and some of the software versions are a couple of years old because they do the job at hand, everyone knows the UI, etc.

I had an intern over the summer and he “upgraded” a station to the latest of everything, modified the terminal profile with mods, tricked out Vim instead of our pre-existing IDE, and so on. I think maybe one of these changes was actually useful, and the rest broke some dependency in our setup that he had to go fix, or I had to get him to revert when someone else needed to use the station.

We also have a new hire who keeps using the word “Dockerize”, which makes me wary


I fail to see how one could get excited about Next.js, young or old. It makes so many poor design decisions for the type of tool it presents itself as. It is my understanding that the original intent of the project differed, so I can understand that the design choices may have been sensible in that context, but are a terrible fit for the change in project scope and, indeed, how people are trying to use it today.

I'm old. I still get excited about technology that is worth getting excited about. But Next.js brings nothing to the table.


The original intent of Next.js holds true today: https://vercel.com/blog/next


Corollary A: As I get older, it is seriously hard to stay interested in video games: more and more, it feels like playing with little toy cars. I have a harder and harder time accepting the "lore" which pretends to be "serious"... which is actually clearly made for teens/young adults, and often lacks coherence even in following the "rules" of its own universe, or abuses the deus ex machina in a cheap way.

Basically, I need something beyond that, which gives me a strong emotional response and/or a feeling of social interaction. I found some very "art"-y games which give me such an emotional response, and some online multiplayer games (for instance Dota 2... until they don't hide AI-based bots... because sometimes...), or "strong community based" solo games (for instance Celeste).

Corollary B: As I get older, I am aware that some nasty human beings are using technology to "take over" big critical parts of our everyday life: that's why I am adamant about interop between Big Tech and Small Tech. Tech must stay in user control and not be used to control the user for "others". For instance, this is why most if not all administration services should be provided to noscript/basic (x)html browsers (as they were before Big Tech brainwashed/corrupted administration tech executives with their grotesque and absurd web engines).


I feel the same way about games— especially AAA titles. I can’t be bothered. Some games that you might consider for giving you a unique feeling without any pretentious nonsense: Limbo, Inside, Journey, Planet of Lana.


Limbo, Journey... yep

Heard about Inside, and lurking on Planet of Lana (but I need to validate it runs well in a lean wine+vkd3d build before #noproton).

(I have been waiting for some of them since I played their demos; I don't recall all the names)


Games just don't respect your time any more. Usually 80+ hours to experience an open world game with a handful of side quests.

While I don't share the same love of "emotional experience" games, at this point I'm thinking about only sticking to mission-based and "wide linear" games so that I leave plenty of time for other pursuits in life.


This sounds more like a complaint about day-to-day work as a developer and less about technology in general. I can certainly relate, though.

I personally believe that one of the biggest lies about professional life is that of "working with your passion," because it creates unrealistic expectations. Plenty of people can be passionate about programming, but precious few will have the luxury of being passionate - over time - about working as software developers.


As I get older, I just care about boring technology.

I want things to work without me having to faff around or spend too long configuring them. This applies to both software and hardware.

The only tool I still use with non-trivial config is my text editor, but that's going to be pared down over time.

Switching from zsh to fish a few years ago left me with a nearly identical shell with a 20 line config instead of hundreds. I'm trying to use this approach wherever possible.


>As I get older, I just care about boring technology. I want things to work without me having to faff around or spend too long configuring them. This applies to both software and hardware.

There's something intellectually alluring about new technology, or the new shiny in general, that I think appeals to your general technologist; it's a trait that draws such people to various fields. I know when I was young I drank the Kool-Aid at every turn.

As I got a little older, and after I was burned several times adopting shiny things that either wasted my time or never got widely adopted to a point of reasonable usability, I came to the realization that everything we create, including technology, exists to serve human needs. It's stupidly obvious and it shouldn't need to be stated, but I think it does. Law exists for humans, medicine exists for humans (even medicine for animals exists to serve altruism or empathy in humans) and so on. Technology is absolutely no different.

There's a few ways technology can serve humans. It can appeal to the general intellectual interest but ultimately, few if any pay money for that. People in general don't care about it, it's neat for a few minutes then gone. So ultimately what functional need does technology really serve? What does it enable me to do that I otherwise couldn't? How does it make my life easier?

If it doesn't do these things, I frankly don't care about it anymore. I have plenty of places to tap into that serve my intellectual interest, places where the things I learn aren't as vanishingly ephemeral and entirely artificial, be it learning something about the physics of the world, medicine, whatever. There's well over a lifetime's worth of mature and well-established things to learn about where you won't waste your time, unlike NewWebFramework with NewSyntacticSprinklings.

To me, this is why I similarly only care about new, well-crafted tech. Your thing needs to either help me do something I otherwise couldn't or make my life easier. A huge amount of new tech does neither; it peddles novelty and the ambiguity of some minuscule improvement in quality of life, all while trying to make a buck.


After working in AI for a big chunk of my career, I have switched gears in retirement and am having a great time learning to program small controllers like Arduinos. It's like going back in time - minimal OS, simple tools and direct access to the hardware. Just for the heck of it, I am building a simple robot from scratch right now and it's been great getting back to the basics.


It is always striking to me how many of my coworkers truly do not care about solving a problem or getting things done. If they aren't exploring some new database, language, tool, or whatever then they just lose interest and try to force one of those things into the codebase.

Oddly enough, I feel like I have some friends that are the same way in their hobbies. It's crazy to me how they'll buy a new board game for $80, play it once, maybe twice, and never look at it ever again.

This constant novelty-chasing is probably what both groups are really after.


Basically what I wrote in a different thread:

I get the idea that some parts of the front-end ecosystem are designed to be busywork, or at least usually turn out to be. But not all of us work in a SaaS development team that can afford to spend 30% or 40% of its time just playing with front-end build systems. Some of us work for clients instead, and need to make the best of the time we have. And more often than not, that means excluding risk factors like large parts of the NPM ecosystem: limiting things to some simple gulp, dart-sass and some terser to “build” the front-end.
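For the curious, that kind of "build" really is just a couple of tasks. A rough sketch (the paths are made up, and gulp-sass/gulp-terser are assumed as the wrappers):

    // gulpfile.js -- compile SCSS with dart-sass, minify JS with terser, nothing else
    const { src, dest, parallel } = require('gulp');
    const sass = require('gulp-sass')(require('sass')); // dart-sass under the hood
    const terser = require('gulp-terser');

    function styles() {
      return src('src/scss/**/*.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(dest('public/css'));
    }

    function scripts() {
      return src('src/js/**/*.js')
        .pipe(terser())
        .pipe(dest('public/js'));
    }

    exports.build = parallel(styles, scripts);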


I have a similar but slightly different take on that

Instead of "As I get older, I just don't care about new technology" it's more "As I get older, I'm more skeptical of new technology"

When I was a young programmer I thought every new framework, language, innovation was amazing. These days I just look at things with a much more critical eye and a higher bar when asking the question "why does this exist?"

My general stance is that most things we are using are not ideal but are sufficient. However, most new things are not enough of an improvement to justify replacing what is already technically sufficient.


As I get older technology seems to be getting worse. Sure my phone charges faster, but it's also too tall and not wide enough, it keeps using more screen area to show fewer settings, and it seems to be able to do less every time I get a new one.


The constant drive for 'more'. When the field was young and smartphones exploded, you could easily make money, as well as provide something of big, new, or unique value - even if it was just entertainment.

Now that we're long past the peak of low hanging fruit of productivity and value, we arrive at the inevitable results of late stage capitalism - extracting the most value possible out of the end user through manipulation and coercion.

It's bloat, it's enshittification, entrapment into rampant rent seeking and it's the low key subversion that has been happening for years.


What I hate about new technology is that it's not really new.

When someone says X is the modern framework for doing this or that what he really means is that X is a cleaner re-implementation of W. But the only reason it's clean is because it doesn't support all the things W did.

If X is successful it will eventually add the features that will make it as messy as W was. At that point Y will come along which is the modern (trademark pending) way of doing it.


There's fundamental learning and there's superficial learning.

After a while you know the fundamentals. Data structures and algorithms, common architectures, known solutions to old problems.

But job adverts tend to go for superficial things. The js framework du jour, some language that anyone adjacent can learn in a month, some new face on the same data structures.


Android development is torturous for this. I've been getting emails from Google recently advising me that I need to update my minSdkVersion or my app will no longer be on the Play Store.

So I opened up Android Studio yesterday to upgrade things and hopefully release a quick beta to shake things down. It's now the next day and I'm still weeding through compile-time and runtime errors caused through the upgrade.

I feel like every time I update Android libraries my code bloats out with extra conditionals and permissions checks, and generally becomes more brittle.

Maybe this is just the nature of building a secure platform which has such a broad reach. I can't claim to know, but it sure sours the developer experience.

Since it's a simple, non-commercial public transport app that I wrote mostly for fun and learning, it's hard to be motivated to fix it.


While I understand the general concern of keeping an Android app on the latest SDK and periodically checking whether it still works on recent Android versions, I really hate when new policies are introduced. I had to resort to using ChatGPT to generate a long privacy policy written in "legalese" to pass. My app updates were blocked until then. Not to mention the fact that my app works fully offline and does not collect any user data. Now users will have a long privacy policy to read that says nothing; at least the policy bots are happy. Another concern is the possible upcoming requirement to display the developer's phone number. If that ever happens, that day I'm removing my app from the Play Store.


I have this weird split. I really like to follow CPU and GPU tech, and I enjoy reading about nerdy stuff like consensus algorithms and esoteric programming languages whatnot even though I'll probably never use them.

But "smart" phones and apps? Meh, as long as they work I couldn't care less. Never have.


True

And it's especially bad in the js world, where it really seems the inmates are running the asylum

(Oh, but blah.js version 87 fixes that weird thing in version 86 which anyone with a modicum of experience could have predicted would go bad.) Well then...

At the same time, don't be the boring greybeard that can't evolve. Evolution is good, and not everything has to be the "tried and true" technology (that will make you finish last in the race - this should be your hint that the newer solution can be better).


I'm quite excited about technology. I'm not too excited about the sloppy seconds in technology. We seem to trade performance for convenience. For example, the whole need for FSR in rendering just to get playable performance shouldn't be the desired outcome. It seems like we keep settling for the half-assed solution.

AAA games coming onto desktops needing a 4090 with DLSS just to be playable at 4K just feels wrong.


Desktop PC gaming may have spelled its own doom with these GPU prices. I have a 11900K, 4070 and 990 Pro but my next system may be one of those handhelds. I tried to get a fair price on a GPU for a very long time. I won’t be doing that again.

On needing a 4090, I think the problem is 4K. These games are getting pretty advanced, and 4K is a very large number of pixels. I like it for the additional horizontal real estate, but I don’t like it for gaming. I would stick to 1080 or 1440. 1440 on a 27-inch screen at the average desktop viewing distance is a great all-around solution. That said, my next monitor is probably going to be 4K. I will just be playing a lot of games in 1080p on it.


Mind if I ask what made you purchase one of the worst CPUs ever? That 11900 is in the same category as the Pentium D 820.


Because that perception is wrong. It's based on poor judgement (often YouTube clickbait and low-information Reddit) and simply on a lack of knowledge regarding CPU architecture.

I would not upgrade yet from this 11900K because it's the best homogeneous core design that Intel released for desktops. It includes AVX512, and is the only consumer CPU from Intel to have it. Intel engineers have stated that they are dedicated to furthering AVX512, and that its removal is temporary. All of the errata from Skylake, which persisted for generations, were finally fixed in Rocket Lake.

Some people were so deadset against 11th gen that they recommended 10th gen because there were 10-core CPUs. Yet the ringbus design it inherited from Skylake was very strung out by that point with unresolvable issues like the well-known L0 parity error. There's many other reasons but I doubt you're interested in rehashing Rocket Lake as it sounds like you feel you have enough information to make up your mind.

And I certainly wouldn't use AMD due to even more bugs than Skylake and poor latency. They're improving, but it's not my choice.


I rotate parts every other year with a specific component in mind.

GPU / Disk / CPU + Memory, so one build configuration rolls into another, constantly maintained.


The volume of "revolutionary new techniques" is exceeds our capacity to master each+all of them. I wonder what a https://killedbygoogle.com/ version of https://landscape.cncf.io/ would look like.


Oh, a post about web development. Yeah, it's hard to imagine anyone getting excited about that. It's different when you do academic research. Learning about machine learning tools, new computational methods to solve nonlinear models, and that sort of thing gets more fun the older you get. The more you learn about them, the easier it is to learn, which reduces the pain.


> Trying to do this Next 12 -> Next 13 upgrade is not fun

Oh, the horror!

In all seriousness, I have been at this since the late 90s. 1996 specifically when HTML/JS/CSS all became a thing. Do I want to write another complex dynamic HTML form with validation? Not really, no. I really don't enjoy it.

Do I want to upgrade this current Angular application from 16 to 17 that I lead?

Yes, I actually do.

Because I know that will keep us from falling behind, from more tech debt, from an impossible upgrade path. Am I barking orders from on high? No, I'm not. I'll actually do it myself, because I want to make sure we are where we need to be.

And after all these years, obviously the front-end is only the least of it. The APIs, the database, the custom hardware, the 3rd party vendors ... the ...

Anyway, burnout is real, if you are feeling it, take a step back. But otherwise, I'd say with technology, you need to care about "new" otherwise you will fall behind.


Yeah, this was my thought as well. If you’re paid to maintain software you have to actually do the work to keep its dependencies up to date—that’s the job! We’ve seen the alternative play out countless times: GitHub forked Rails 2.3 and maintained it, absorbing more and more tech debt for years, until they bit the bullet and upgraded back to mainline Rails.

Sure I’m getting old and it sometimes feels tiresome that software is a treadmill, but hey, it’s job security! And more importantly, change opens space for improvement, which feels good even if it doesn’t always live up to the promise.


> Sure I’m getting old and it sometimes feels tiresome that software is a treadmill

Precisely.

Sometimes, I wonder if people are curious why, after all these years, I'm apparently still working. Have I not become a "VP" or a "founder" or, better yet, a "VC"?

To each their own. I'll be ok.


Life is all about choices. Which tech to invest in is a hard question; for me the answer has always been 'as little as I can get away with', while at the same time studying the whole industry for opportunities and things that I believe will have long-term staying power.


As a counter example, I love learning new languages and frameworks. Reading a “Getting Started” tutorial is as interesting to me as picking up a new book. That’s what I like about public cloud, with hundreds of services there is always something to learn.


I almost don't care about 'technology' anymore.

Rationally, I appreciate it. But it's best when it disappears and I get my time back to go outside and ride a bike, converse with humans, listen to music. It's a means to an end.


I find as I’ve gotten older I’m no less enthusiastic about new technology. But I’m much more reticent to assign any intrinsic worth to “new”.

Young me would have assumed “new” meant “better”. Now I’m absolutely certain that is not the case.


The post was about not wanting to spend time constantly upgrading.

Sometimes life is too short to upgrade all my apps to the latest standard or framework, no matter how great the new things are. I hope they could be more backward compatible.


There’s plenty of old technology to explore!


This. I love getting stuck into new-to-me technologies that are tried, tested, and still around. Is this a good idea from a career perspective? I'm not sure. While I knew some SQL before, I'm deep diving now and I'm sure that'll be useful in snagging future jobs in the data space. But learning Lisp? The principles and perspectives I pick up will no doubt be applicable elsewhere, but perhaps not the language. But then again, not everything is about career progression — I'm really doing it for fun and on that front it's certainly delivering!


I'm feeling a bit of this with all the AI stuff lately.

I went all in on AI back in 2017 trying to build a chatbot for ecommerce from scratch and it went nowhere. Maybe a bit sour grapes on my end.


I never built any AI stuff, though, and I feel a bit the same. I watch people in my group screw around with it, and nothing really special has happened, except that it has been talked about and hyped up endlessly. The CTO even sent a mail around saying to expect a 75% productivity boost within the next few weeks; it never happened...


I don't think this is really true. It's just that the tech world has plateaued and been generally commoditized and boring. I've noticed that over the last 10 years not much has changed. Tech used to have stuff you never saw, or features that did not exist, or that solved some pain point you had.

Now Next 13 adds new route handling and API fetch changes for its major release; it's not exciting. (I don't use Next.js, so forgive my generalization on the new features.)
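For anyone who hasn't poked at it, a minimal sketch of what the new-style route handling looks like (the file path and API URL here are hypothetical, and this is from reading the Next 13 docs rather than first-hand use):

    // app/api/items/route.ts: a Route Handler in the Next 13 "app" directory
    import { NextResponse } from "next/server";

    export async function GET() {
      // Next 13 extends fetch() with caching hints; revalidate is in seconds
      const res = await fetch("https://api.example.com/items", {
        next: { revalidate: 60 },
      });
      const items = await res.json();
      return NextResponse.json({ items });
    }

Useful plumbing, sure, but incremental rather than something that didn't exist before.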


New physical tech used to make a much bigger difference in the beginning of PCs... Now someone can use a 10 year old PC and be perfectly happy for most applications


Web “tech” is saturated, mainly by the loads upon loads of forks and frameworks that reinvent the wheel and somehow end up making everything slower to achieve.


I'm 59, and I'm selective about which new technologies I get excited about. One question I ask myself is, will I regret missing out on this or waiting for the next version?

I would have regretted missing out on microcomputers and programming.

I don't regret missing out on Windows Vista.

I get excited about technologies when I find a use for them. So far I haven't found a use for LLMs, but that could change, and there will always be the next version.


I'm 52 and I care about new technology. If it looks like it improves things.

Most of the time that isn't the case, and it's simply more useless abstraction and time wasting and marketing and resume building and scamming and ego ("very clever, see!").

But it is worth looking into things briefly if they pass the initial 2 second smell test in my opinion. There is the occasional diamond in the dung heap that passes for modern technology.


I wonder how much of this is due to innovation happening "under the hood" rather than on the surface.

As an example: the newest model of iPhone looks 99% the same as the original one that came out 15-20 years ago. The iPhone 5 years from now will probably look the same.

Compare that to the difference between cars in 1960 and 1980, or telephones in 1980 and 2000. The external form is vastly different.


The AMC series Halt And Catch Fire had the line: “Computers aren’t the thing. They’re the thing that gets us to the thing.”


Can I add that being wary of the devs who love all the shiny new things kind of tags along with the original sentiment.


Technology was more interesting about 20 years ago. All different phone models, with constant feature improvements, different form factors, so many different things to choose from. Now we have a boring black rectangle running one of two mobile operating systems, not much to care about here.


"When phones were fun" is a, well, fun series about cellphones of yesteryear.

https://m.youtube.com/playlist?list=PLwd8abTO4vh2smuMzykXDOP...


I don't know about the rest of the world, but I keep losing interest in technology because it's rarely meant to be useful anymore. More and more technology seeks these goals:

1. control

2. vanity

3. ego

4. control

5. megalomania

6. compulsive urge to radically change everything you ever thought you knew about ____________

7. control

I haven't been the target audience of technology for a long time.


I too have developed an aversion to new frameworks, but mostly because the old ones become outdated and go out of support so fast. It feels like we're forced to keep updating and switching tools without any real gain, just to stay compatible with the infrastructure.


There is no new tech in software. Just rehashings and rebrandings of old stuff. Different flavors, but the same ice cream.

So pick a flavor you like and stick with it.


When you're a kid, everything is magical and the world is paradise.

When you're an adult, everything is set in stone and the world is out of touch.

When you're an elder, nothing is as it was anymore and what world is this?


“I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.”

― Douglas Adams, The Salmon of Doubt


Corollary: Technology that existed when you were a child is considered normal. Technology introduced when you are a teen or 20-something is exciting. New technology after 40 is scary.


New tech after 40 is boring. 40 is young. New tech after 60 can be scary, it doesn't have to be but you are going to have to relearn a lot of stuff and that gets harder with age.

The most annoying is the tech that is forced upon you.


That’s an interesting way to look at it. For a certain generation like me, though (I’m over 40 but under 50), all the new technology coming out is trivial. Appliances. The desktop PC era, from the Commodore to DOS and Linux and even Windows with its complex setup, was far more challenging than anything coming out today.

I think there is a generation of technologists that are the prime era. Not too old to have lived the simple life. And not too young to be living the simple life of the iPad. At least in regards to technology.

The modern life is very challenging if you’re doing it right. There is a lot to research and be responsible for, unless you just run around signing your life away on paperwork without properly educating yourself about mortgages, real estate, housing. That on top of learning a trade is pretty tough, for the average person at least, and it takes a lot of effort out of everyone. Most people just slough off, though, and settle for less. Or blame the system for their condition, since no one is making $50 an hour out of high school anymore, like it was for decades. One of those two.


And it'll happen to you...


This thread is distressing. I come here to keep tabs on what the yoots are thinking and it reads like I’m in an IHOP next to a retirement community.

Where do the edgier Waffle House techies hang out today?


As I get older I'm still finding things I care about, but they're more niche than the flavour-of-the-month JS frameworks.

I'm learning Clojure Electric at the moment and it's wild


If you look at the sysadmin side, I'm only excited about old, long-stable software, not the new stuff. But for technology as in e-bikes and things that make my life better, I'm all for it. A good one was the Wii: if I was in a nursing home, or an old man, and you showed me that thing I would be super excited. E-bikes allow old people to ride bikes again. So two sides of it: improving my life, yes; managing systems, keep it stable for me please.


What tech matters to you and what tech should you care about? If this were an Eisenhower matrix, this would be the urgent-important quadrant.


Man, I went through the same thing. When Swift was introduced by Apple back in 2014, I pushed for its adoption as a senior engineer. When Apple introduced SwiftUI, I was like, let's wait 2-3 years and see how it goes before we adopt it, and I'm glad everyone in my team agreed with me.


Only tech I've been excited for in the last ten years is generative AI


> As I get older ...

Top comment on the reddit post:

> Yeah. 20 year dev here.

Oh, the humanity.


Author probably meant 20 years of experience. That's quite a long time to stay in an industry that values the wet-behind-the-ears-excited-to-be-exploited developer. I've long supposed the "ageism" in the industry is largely people just getting tired of keeping up with the coolest new toy. I've figured that if I lose my job I will probably need to leave the industry because I just can't be bothered to learn the Next Greatest Thing that Google is shilling.


I don’t understand. Somebody who has been developing for 20 years is likely in their late 30s or early 40s at least. Did you read that as the poster claiming they were 20 years old?


I did. Silly, silly me.


Don’t use frameworks. Problem solved.


That’s what version locking is for
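In practice that can be as simple as pinning exact versions (no ^ or ~ ranges) in package.json and committing the lockfile. A minimal sketch; the package names and version numbers here are purely illustrative:

    {
      "dependencies": {
        "next": "13.4.19",
        "react": "18.2.0",
        "react-dom": "18.2.0"
      }
    }

Then install with npm ci so everyone (and CI) gets exactly what the lockfile says, and upgrade only when you decide to, not when the ecosystem does.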


Cue the Confucius quotes about change.


The tragedy is - why do the young care about technology? Looks like they have to find some meaning in their unfulfilling lives.


Technology is a false god. Speaking of which, I spent about 15 years of my life being an atheist. I went to Mexico last Christmas and came back a changed man. I answered the calling of my conscience. I think over time the current political situation in the US is going to push people back to traditional values as well. Living life for yourself is always going to be ultimately unfulfilling. Men need to sacrifice and be responsible for others. And submitting our own will to the will of our creator is very liberating.

And if anyone from Mexico sees this, God bless Mexico. I’m thankful for Catholic nations. The level of devotion and permeance is awe inspiring. I have a new admiration and understanding now of the Catholic world. It almost feels like home in a way.


Religiosity is rapidly going down over time and it's unlikely that it's going to suddenly turn around.

https://www.pewresearch.org/religion/2022/09/13/how-u-s-reli...


Everyone knows that. I’m not sure why you would even bother pointing it out when it’s common knowledge, unless it’s news for you because you were very young. But the fact that one person says there’s no turnaround coming, and the next person, me, says there will be because of a degenerate society in decline, means nothing. Belief in salvation is a personal choice that everyone gets to make. We are compelled to confess our belief, but ultimately everyone has free will for their own choice.

I’m basing my view off of historical trends during periods of societal decline. There will be a point in time when there will be no Christian revival; that’s the prophecy of Revelation. People are, and will be, looking for better answers. There is one answer that has not only served us well, but is also true. Unless someone is strictly materialistic and doesn’t believe there is any supernatural realm whatsoever, the question has to be asked: is Jesus who He says He is? If people seek the answers, I believe they will come to the same one that so many have before us: that Jesus absolutely is who He says He is, our creator, God.


We don't even have a name for it now, that's how bad it is, IMO. All those mere vessels relaying what "the average user" or "the industry" requires, which we should most humbly submit to. The way "disrupt" was thrown around, or the kool-aid required to believe that continuing down the route of consolidation and inequality will somehow result in universal basic income. People who fly to 15 conferences a year rolling their eyes at anyone who refuses to understand that nuclear energy is the way to solve climate change.


Did you mean to reply to someone else?


I’m probably guilty of idolizing tech at times too. Others here have echoed the importance of using tech as a means to an end, not as an end unto itself. Thanks for sharing your journey.


[flagged]


Please don't do this here.



