My 20 year career is technical debt or deprecated (visionarycto.com)
579 points by spo81rty on May 15, 2023 | 573 comments



> My entire career is now technical debt, or the code has been deprecated.

My fellow devs often laugh when I tell them that instead of looking at all the long-dead tech that's no longer useful to me, my way to feel good is to look back at all the long-dead tech I never bothered to learn.

And, geez, is the graveyard huge.

> Java Applets were also a big thing once upon a time. They were slow, and having the correct version of Java installed on your computer was always a mess.

Java applets were never that big. They didn't work very well (for the reason you mention) and weren't ubiquitous. They also nearly all looked like shit.

But Java isn't disappearing anytime soon. Java is huge and it'll leave a legacy dwarfing COBOL's many times over. Many may not find Java sexy, but the JVM is one heck of a serious piece of tech, with amazing tooling available.

And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

Did anyone in the mid to late nineties / early 2000s really discover Java and Java applets and think: "Java applets is the tech that'll catch on, I'll invest in that" and not invest in Java itself? To me Java was the obvious winner (not that it was that great, but it was clear to me Sun was on to something). And, well, compared to the other dead tech, at least if you learned Java applets you got to learn Java too, so it's not all lost.


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

Guilty as charged! I hate using Java because everything written in java seems to blend into the same indistinguishable swamp of classes with meaningless names, full of methods that constantly find new and interesting ways to obscure what your program is actually trying to do. Debugging very large Java codebases feels like living through Terry Gilliam's 1985 film Brazil.

I think the problem is cultural, not technological. It seems like there's a lot of people in the Java community who still think OO is a great idea. Who think Bob Martin's awful, muddled code examples in Clean Code are something to aspire towards. People who claim the path of enlightenment requires them to replace concrete classes with interfaces - even when those interfaces only have 1 implementation.

Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company.

This is what makes IntelliJ so impressive. It takes a master craftsman to forge something beautiful from such an unholy material. VS Code pulls off the same feat on top of Electron. In each case I can't tell whether to be horrified or impressed! Either way, I'm a huge fan.


Does anybody know where I can find a good, substantiated, critique of the common Java coding patterns, including on Android? This niche is such a huge mess that simply documenting all the bad things in a single codebase (along with explanation why they're bad) took me a month. It's tiring and mentally draining to do this: every other line of code you read makes you go "Why. Please, just tell me why anybody could ever think good code should look like this". Worse yet, there's hardly a discussion to be found about these things - it's "Uncle Bob" & co and the horde of their followers all the way, no dissenters. It looks like an echo chamber so hermetic that the most basic principles like DRY or YAGNI have a hard time penetrating it.

What's most painful for me about all this, other than it being 99% self-inflicted and not caused by the language (you could argue the language encourages it, but the ultimate cause is the culture), is the fact that it has infected Kotlin code. Kotlin was built to increase the expressive power of the language, doing away with multiple limitations of Java and offering lots of modern-ish features on top of it. The community looks split in half: the Kotlin community tries to get the most out of Kotlin, and the Android community does everything in its power to turn Kotlin back into Java, writing code as if the limitations were still in place and the new features didn't exist. I know that the churn and "production readiness" of things on Android generally favor a more conservative approach, but it's still too much. I cry tears of blood every other code review I'm forced to do.

If there's a single, definitive resource that I could point my coworkers to and eventually turn it into enforced guidelines, the author can count on a serious donation from me.


Hah. I’ve spent several years writing javascript for a living. You can always tell when code was written by someone who’s arrived fresh from Java or C++. Their code is full of hundreds of lines of useless classes which can often be replaced by a few simple object literals. Unlike class instances, object literals can be easily JSON stringified and parsed, too! You can torture people who are like this in code review: “This isn’t idiomatic. Please rewrite this code without the class keyword”. I’ve seen people make a face like I just had their child expelled from kindergarten.

I don’t know any good resources unfortunately. I feel like we need a “motherfuckingwebsite” equivalent for this - “just use a motherfucking function”. I want to link it to whoever insisted on adding a useless TextDecoder class in javascript that you have to instantiate, instead of just calling textDecode(…, “utf8”) using a global function like the rest of the standard library.

I think part of the problem is that most people who hate enterprise java just learn a different language, set their resume on fire and start over somewhere better. That’s certainly what I did. I’m writing rust at the moment, and thankfully the lack of classes and distance from the JVM seems to keep most of this nonsense out. But having all the doubters leave makes the problem within the java ecosystem worse.


> I want to link it to whoever insisted on adding a useless TextDecoder class in javascript that you have to instantiate, instead of just calling textDecode(…, “utf8”) using a global function like the rest of the standard library.

I for one would rather punch the person who proposes such a global function as the only mechanism for conversion, because charset conversion is a reasonable thing to do on chunked partial inputs, and maintaining the state for conversion yourself is actually quite painful. Wrapping a stream converter into a one-shot function is much easier than the reverse, wrapping a one-shot function in a stream converter.
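
For what it's worth, the JDK makes the same trade-off: java.nio.charset.CharsetDecoder is the stateful, chunk-friendly primitive, and one-shot decoding is a thin convenience on top of it. A minimal sketch in Java (the thread's main language; the helper name is made up):

    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    class OneShotDecode {
        // The streaming primitive is CharsetDecoder (feed it chunks, it keeps
        // state across calls). Wrapping it into a one-shot helper is trivial;
        // building a chunk-safe decoder out of a one-shot function means
        // tracking split multi-byte sequences across chunk boundaries yourself.
        static String decodeUtf8(byte[] bytes) {
            return StandardCharsets.UTF_8.decode(ByteBuffer.wrap(bytes)).toString();
        }
    }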


You can have the global function and make it work on "chunked partial inputs" while maintaining the state between calls (that state can be hidden in a closure or made explicit as an argument or a receiver - functions in JS can be called on objects even if they were defined outside of them). The very bad example of this is C's `strtok`, but it's pretty typical for JS to encapsulate the state in a closure to get similar functionality.

Said another way: a functional, simplified interface doesn't mean you have to accept simplified or missing functionality. Haskell wouldn't exist if that were the case. The simplified interface can provide as much functionality as the more complex one because the expressive power of JS is leagues above Java's. Porting Java patterns that emerged from Java's shortcomings (some call them "design decisions", and they're also right) to JavaScript is simply not a good use of JS as a language.


This is what GP is talking about. Chunking can be done cleanly with very few internal functions (and without changing the call signature, if you want), but you're implying it must be a class and already thinking about hypothetical hacks.


> I for one would rather punch the person who proposes such a global function as the only mechanism for conversion, because charset conversion is a reasonable thing to do on chunked partial inputs

Fight me.

Javascript has a separate TextDecoderStream class if you want to support chunked, partial inputs. The TextDecoder class doesn't support streaming input at all. And it never will, thanks to the existence of TextDecoderStream.

TextDecoder only provides one method, TextDecoder.decode(). And the method is pure! So you could just make it a global method without any loss of functionality. The entire class has no mutable state whatsoever. It's just a container for some options passed into the constructor. Those options could just have been passed straight to a global decode() method.

https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder

This might be idiomatic C++, but it's terrible javascript. I had a friend who worked on the Chrome team years ago. He said a lot of the browser engineers who design javascript APIs are C++ programmers who don't know javascript properly. So of course they port C++ ideas into javascript directly. They don't know any better. This is why javascript's standard library is an incoherent mess.


I see a benefit from having this options container: you get a central place to set the options, then you only pass down the decoder, and the user of the function doesn't have to bother with the configuration.


You can always pass a closure.


Or pass an options object.

Regardless, it’s a strictly worse api in the general case. And it’s on par with passing around an options object in the case you want to share the same options in multiple places.

If the current api is either the same or worse compared to a pure function version in all cases, I’d prefer the pure function version thanks.


An Options object is a worse API, since it assumes an implicit dependency.

> If the current api is [...] the same [...] compared to a pure function version in all cases, I’d prefer the pure function version thanks.

Why?


> An Options object is a worse API, since it assumes an implicit dependency.

No, it would be an explicit dependency.

    // definition:
    function decodeText(input, [format | options])

    // use:
    decodeText(someBuffer, 'utf8')

    // or:
    const opts = {encoding: 'utf8', fatal: true, ignoreBOM: false}
    decodeText(someBuffer, opts)
    decodeText(somethingElse, opts)

> Why?

Because a pure function is idiomatic, concise and clear. It's easier to use, easier to document and easier to understand. The only benefit of a class is that it encapsulates its state behind an interface. That makes sense for a BTree or a database. But TextDecoder has no mutable state. It has no state at all except for its passed-in configuration.

Decoding text is a verb, not a noun. In programming, verbs are translated to functions, not classes. You don't have an "adder object" to add things. You don't have to instantiate a JSON stringifier before calling JSON.stringify. If that's not obvious, you may have spent too long in OO languages.


Well, that's basically the same


It wouldn't need to be a global, it could be mounted on String for example... And one doesn't necessarily prevent the other.

    String.fromUint8Array: (input: Uint8Array, encoding: string): String;

Would be a fine, simple addition, and probably where it should live. Maybe call the default for Uint8Array "String.fromBinary" instead.


> “This isn’t idiomatic. Please rewrite this code without the class keyword”. I’ve seen people make a face like I just had their child expelled from kindergarten.

Lol true

Don't forget all the getters and setters merely updating/reading a variable

Python says "explicit is better than implicit" but Java goes too far with it, and in the most verbose/inflexible ways possible


> Don't forget all the getters and setters merely updating/reading a variable

> and in the most verbose/inflexible ways possible

It's actually extra flexibility meant for two things: being able to override the getter/setter in a subclass, and keeping a consistent interface so users don't need to change how it's called if there was a refactor that adds something to the getter/setter (such as transforming the value because a different representation was more useful internally; particularly useful for libraries).

Python has @property to maintain that interface if need be, but these Java conventions started when no such thing existed in the language. I haven't done Java in a long time, so I don't know whether it has something similar even now.
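
To make that concrete, a minimal sketch (hypothetical class): callers keep calling the same getter and setter even after the internal representation changes.

    public class Pipe {
        // v1 stored feet in a double; v2 switched to millimetres internally.
        // Callers of getLengthInFeet()/setLengthInFeet() never notice.
        private long lengthMm;

        public double getLengthInFeet() {
            return lengthMm / 304.8;
        }

        public void setLengthInFeet(double feet) {
            this.lengthMm = Math.round(feet * 304.8);
        }
    }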


> It's actually extra flexibility meant for two things: being able to override the getter/setter in a subclass, and keeping a consistent interface so users don't need to change how it's called if there was a refactor that adds something to the getter/setter

This always strikes me as any-benefit mentality thinking. I agree there is some small marginal benefit to this pattern, but the cost (in time, decreased readability and lines of code) is massive. The benefit of being able to change your getters and setters later in a public interface almost never actually shows up.

Most getters and setters aren’t even part of a public interface anyway, because either they’re private or they’re in application code. In both of these cases, you can delay replacing a public class field with public getters and setters until you actually need to. When it actually provides value, it’ll take all of 5 minutes to do the refactor. IntelliJ can probably do it instantly. And, spoilers: this will almost never come up in practice. Public fields are almost always fine.


It was originally for libraries that were distributed as jar or class files without the original source, that crept into general "best practices". Also IntelliJ didn't even exist in the 90s when this started.


It exists now. And even before IntelliJ and Eclipse had automated refactoring tools, it was like a 5 minute refactor. Just change your public field to be private, add getters and setters, then play whack-a-mole mechanically fixing all the compiler errors that show up.

I can see the argument for putting them in APIs exposed in jar or class files without the source. But the tasteless trend of adding getters and setters everywhere just looks to me like cargo culting. It's sheep programmers leading other sheep. You can tell it's cargo culting because if you questioned anyone about the practice they would always ultimately justify their actions by saying "oh, I just do it because everyone else does it".

I believe it's the responsibility of every engineer to decide for themselves what they think beautiful code should look like. You get some pointless arguments, sure, but the alternative is always a mess.


I agree with the cargo-culting, but the person I originally replied to seemed to think there was never any point, and that's what I was replying to - there was a reason it started.


> these Java conventions started when no such thing existed in the language.

Is there language support for these in the newer Java versions (I'm not up to date with newer features, since I won't be able to use them on Android anyway)? The reason for these getters/setters is as you said: a workaround for the language deficiencies. It's true for quite a few patterns, and it's not unique to Java; you get similar (in nature) patterns emerging in all languages. Greenspun's tenth rule and all that.

What's problematic is porting these workarounds wholesale to languages that don't have the limitations that originally led to their creation. In Kotlin, for example, every property has an implicit getter and setter by default, and you can override either easily with a dedicated syntax. In that case, insisting on writing explicit getter and setter methods is simply a misuse of the language. Same in Python, as you note, where you can replace direct access to an object attribute with a property without changing the user-facing interface of a class. I think JS also developed a feature like this? It's kind of impressive that OO languages managed to get this wrong for so long, even though Smalltalk got it right in the 70s...
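
(For reference: since Java 16, records at least generate accessors that can be overridden later without touching call sites, though there is still no general property syntax for mutable classes. A minimal sketch:)

    // The record generates temperature.celsius() for free, and the accessor
    // can still be customised later without changing callers.
    public record Temperature(double celsius) {
        public double celsius() {
            return Math.round(celsius * 10.0) / 10.0; // e.g. normalise on read
        }

        public double fahrenheit() {
            return celsius * 9.0 / 5.0 + 32.0;
        }
    }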


I think it's strange that people complain about the verbosity and boilerplate of getters and setters in Java when this is entirely a non-problem, provided your code is well designed.

If your class has any setter function, you're doing OO wrong. Mutation of an object should 1) only happen if you have a very good, inescapable reason, and 2) never be exposed directly to code outside the class, including children. If your class must have a mutating function, it should be a high level operation, not "set". If it really is "set" then that implies the field being set isn't a part of that object in any real sense.

A well designed class might have a couple of getters, but the inclusion of getters is a deliberate decision to allow client code to see the internal state.

In other words, blame the IDEs for the idea of auto-generating getters and setters. The language itself did a decent job of protecting class state.
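
A tiny sketch of that idea (hypothetical class): the only mutation is a high-level operation, and the single getter is a deliberate window into state.

    public final class Account {
        private long balanceCents;

        // A high-level operation instead of setBalanceCents()
        public void deposit(long cents) {
            if (cents <= 0) {
                throw new IllegalArgumentException("deposit must be positive");
            }
            balanceCents += cents; // the only place the field mutates
        }

        // A deliberate, read-only window into internal state
        public long getBalanceCents() {
            return balanceCents;
        }
    }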


OOP is routinely used with stateful, mutating objects; it has been hailed as a good way to manage that paradigm. If mutation is bad, so that most objects shouldn't mutate, then you're talking about a functional niche in OOP, not mainstream OOP.

A class may need a setter function for some boring, pragmatic reason like, say:

- the language doesn't have keyword parameters: constructors have only positional parameters

- the class has a large number of properties.

- most users set only a small subset of the properties, defaulting the rest. (And you can't easily predict which subsets are popular enough to get their own constructor variants.)

In that situation you might want to just construct the object in two steps: default everything and set a few selected properties (and then don't touch them). It's de facto immutable, just not around construction time.


I almost agree with you; I just don't agree with the gatekeeping/nitpicking of saying "you're doing it wrong".

But I do agree that in most cases you don't need to call individual setters, and especially not to automatically create one for every variable in your class.


No, it's not extra functionality, it's a crutch

> being able to override the getter/setter in a subclass

I think even C++ can do this without an explicitly named getter/setter

> and keeping a consistent interface so users don't need to change how it's called

Just use a better language

> Python has @property to maintain that interface if need be

Exactly. Java is unjustifiably limited in this regard


When you're writing a dynamic library, and providing a consistent ABI between versions is a concern, you come to appreciate this.

Perhaps you may also appreciate PImpl/D_PTR etc.


Word! I just left a Java-only shop for pythonic pastures and the culture is so much more pragmatic and to-the-point. Hopefully soon enough ML models can be fed millions of lines of code and produce the functionally equivalent thousands...


They're trained on billions of lines of code - most of them are not very good. I'm using Copilot, and the docstring/docs it suggests are so bad it hurts. If left alone, Copilot would happily generate those thousands of lines instead of helping reduce them to hundreds. It's still useful if given enough direction, but you need to be really careful not to overuse it or risk getting mistaken for a junior straight out of a bootcamp during code review :)


Yeah... I've noticed about half of what it comes up with needs tweaking... I really enjoyed it for SQL schema writing though... especially many-many table creation.


> You can always tell when code was written by someone who’s arrived fresh from Java or C++.

AS3 and Flex were practically an attempt by these people to take over the language. Thankfully that failed.

TS is great by comparison!


Not a very recent one, but this "Kingdom of nouns" resonates deeply http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom...


The problem with Uncle Bub is that his views only hold up in very limited situations, such as hobby coding and small projects by a handful of developers who are starting on the same page. A lot of Uncle Bub's teachings and FrAgile aren't practical in most of the real world, aside from a minority of outlier stories (and the problem with those stories is they don't track whether the practices keep working long term).

Take his views on "clean code" for instance. Or really any conception of "clean code." Clean code is bullshit. I guarantee you can take anything Uncle Bub or other notable programmers say about clean code, apply them to a tee at your job, and be told that your code isn't clean or "feels icky" by whomever joined the team before you did. No description of clean code that I've read has ever been truly helpful in my career. The only thing you can really do is decide what you think is clean code and for you and your team to reach a level of agreed disagreement so that everyone can get their job done. One person's descriptive variable name is another person's "that's too long I can't read with all these confusing names", and one person's set of short purposeful functions is another person's "I can't tell what's happening cuz I have to jump between all these functions." At the end of the day, you barely have time to write "clean code", because your boss wants features rolled out ASAP.

Uncle Bub is also one of those guys who thinks good code doesn't need comments because it's self descriptive. This is one of the worst ideas to ever have met the software industry. No one's code is self descriptive. It's all a bunch of gobbledygook because it's meant to be run by a computer and just understandable enough for a human. It wouldn't kill us to just write some documenting comments detailing the intention behind code, but sadly most programmers either are too lazy or believe that if they need to add comments then that necessarily means their code smells. The result is that nobody knows anything about any given software project except for those who have been on the project the longest, and even they often don't know because... surprise... nobody wrote anything down! Just like with "clean code", it should be left up to teams how they want to add comments to code, and how much you comment your code shouldn't be influenced by memes from other programmers.

Don't even get me started on FrAgile. It's just a way to dupe programmers into taking on more work and doing the job of middle management for them.


If you only write comments about intention, your code seems self descriptive, doesn't it?

Why don't you write comments describing what the code does?

> and even they often don't know because... surprise... nobody wrote anything down!

Well, read the code


> If you only write comments about intention, your code seems self descriptive, doesn't it?

And who mentioned writing comments only about intention?

> Why don't you write comments describing what the code does?

Yes, why don't we?

> Well, read the code

Because code is never long and vague, and human language can never sum up complex ideas or code?


You wrote:

> It wouldn't kill us to just write some documenting comments detailing the intention behind code

Intention behind the code != Describe the code

> Because code is never long and vague, and human language can never sum up complex ideas or code?

No, code isn't vague, and good code mostly isn't long.

Variable names are human language btw. A programming language is a human language.


> Intention behind the code != Describe the code

Where are you getting this exclusivity from? Given your use of an equality operator, maybe you're thinking in code a bit too much.

> No, code isn't vague, and good code mostly isn't long.

This isn't even remotely true. There are many dimensions to code that a human might subjectively use to determine whether code is "good." Code that is "clean" can often be a performance nightmare, but code that someone subjectively claims to be "unreadable" can be more performant, more fault resistant, future proof, and so forth. In the context of human interpretation, code can be vague regardless of how "good" that code seems to someone.

Also, the mere fact that anyone can disagree with you that good code "mostly" isn't long discredits the very idea in an objective sense. Plenty of programmers don't care whether code is "long" if it's written procedurally and/or with pure functions. If you haven't heard such opinions before, then you need to meet more programmers of varying disciplines.

> Variable names are human language btw. A programming language is a human language.

A programming language is for the benefit of both the human and the machine, though it's still mostly to the benefit of the machine. If it were solely a human language, then it would be closer if not identical to a language like English. And, if it were, it would be tremendously slow relative to traditional programming languages, and even generate more waste heat.


Mostly agree: the biggest factor in determining "readability" is simply the reader's familiarity with the particular syntax or style of programming. Learning many different languages allowed me to experience the evolution of code from "how do I even read this" to "well, it's clear and obvious what's going on" without it changing one bit in the meantime. There's nothing you couldn't learn with enough effort, and the differences in the learning time needed to get to mastery are, in my experience, small, save for a few outliers (IME: J, Forth).

At the same time I believe that there are a few objective metrics that seem to correlate with long term maintainability of a codebase. For example, the more contextual, relevant, and correct(!) information it contains, the easier it is to work with the code, especially once the original authors depart (and they will, sooner or later). Capturing that information and putting it in the code - in whatever way, including comments, diagrams (ascii art and graphical), particular tests and doctests, explicit pre and postconditions, other assertions, log calls - lowers the effort needed to maintain said code.

If the additional information threatens to obscure the view of what's actually happening, you can refactor it the same way you'd refactor code. If you can introduce helper functions to kick details out of the way, you can also add footnotes and links to files with additional docs to keep the level of detail manageable in the most often read code, while still providing enough information about that code.

What do you think? I came to this conclusion based on my experience with learning a very diverse set of programming styles and languages, experience with maintaining long-lived projects at work, and the "Programmer's Brain" book. The book has its moments, though for the most part it's just boring, but it did provide me with a few puzzle pieces I needed to make some sense out of the whole thing.


Those Kotlin issues are common when a lot of engineers from one language switch to another at the same time.

We used to see the same thing with companies that had moved C engineers over to Java. Lots of weirdly overcomplicated C constructs. Meanwhile the newbs who only knew Java were writing FactoryFactoryFactoryImpls. :facepalm


I think a lot of it comes from the Spring framework. People saw those gigantic stack traces with all the crazy abstractions and huge names, and took it as the norm.

It doesn’t have to be that way (see also Guice).

But the Uncle Bob effect is sadly real.


Some genius ported the Spring framework to PHP (and called it Symfony) and we have to live with the BS from Java world in PHP land. Anemic models and lots of indirection. What a sad world we live in.


I did a lot of Java in the past, and ended up in PHP lately. Funny how PHP feels like 'I wanna be Java when I grow up', while Java (Quarkus) says 'I don't wanna be Java anymore'.

Then again, Java itself has Spring, which started out as '4 classes for an EJB is insane architecture overload' and ended up in dynamic injection architecture astronaut land.


Isn't Laravel based on Symfony components?


The Java infestation is real.


Don’t bother. Your preferred style will go out of fashion just like the previous ones all have. Chasing this is low value busy work.


The enterprise java programming style is worth avoiding because it’s a productivity killer. This style obscures your business logic, it makes debugging harder through needless indirection, and it creates pointless busywork from the need to write, maintain and document reams of unnecessary boilerplate.

FoundationDB has official bindings in C, Python, Go, Ruby and Java. The real bindings are in C, and all the other languages’ bindings are well written, idiomatic wrappers around the same C library exposing the same functionality. The Java bindings need over twice as many lines of code as the Ruby and Python bindings to achieve the same thing.

Even if this style of Java is only 10% less productive than that of idiomatic Kotlin, Go or Python, you will probably break even on the investment of migrating languages after mere months. I think that undersells it. The productivity difference is probably much higher. Especially for large projects.

Improving your personal and your team’s long-term productivity is just about the highest value work you can do.


All subjective. That style has plenty of people who don’t agree with any of your criticisms and could list many of what they consider benefits. It would also be very difficult to prove that extra lines of code have any financial impact, let alone one that could be recovered in a matter of months.


I understand your viewpoint, but I think it's a bit too pessimistic. While we essentially don't know how to consistently write good code and deliver quality products on time (the whole industry, save for some niches, has this problem), it's also improbable that nothing we could try would get us closer to that ideal. Not too close to it, perhaps, and not in a matter of months, and not by simply switching one set of rules of thumb for another, but surely, there's got to be something that can have a noticeable impact. Especially over the longer term and in larger codebases.

Trying to replace X with Y, where both have similar expressive power and similar drawbacks (i.e. FP vs. OOP), won't help much. Not on its own, and not without many other conditions being met, including completely non-technical ones like the personality of a hiring manager. Surely, though, in every paradigm or style, it's possible to write better or worse code, right? So it should also be possible to create an environment where the code quality, on average, would be just a bit better than the norm.

I'm not looking for quick gains for a single project, but rather a medium-term strategy that can save the effort required to develop and maintain products. People who claim to know how to "get good code quick" are mostly swindlers, and it's really hard to confirm causality in practice, but we shouldn't give up on finding ways to get better-than-average results in development.


> Chasing this is low value busy work.

I disagree. We might understand what I meant by "style"[1] differently. As I think of it, it's comprised of things that have an actual, measurable impact on the effort required to develop the codebase(s) over time. It's not about tabs vs. spaces, snake_case vs. camelCase, or anything even remotely like that. I'm not trying to establish a company-wide set of guidelines for the sake of it - I believe that relatively minor things (in the scope of a single project) can lead to significant savings at the scale of tens of projects and five years of a maintenance window.

As for the guidelines themselves: I don't care what they are, exactly, as long as a) they're there; b) they reduce said effort; and c) they're followed.


I've seen people take Bob Martin's concepts and do some truly awful things with them. Mind-bogglingly awful contortions of concepts into classes, in arrangements that have to be sourced from demonic inspiration.

At the same time, I've seen the best code of my entire life formed from his concepts. Code that will last decades, far outlasting the UIs that feed it data or the databases that will store it.

I think the difference is all in whether the developers who wrote it understood that the "concepts" are not meant to be put into code on a 1-for-1 basis. For example, making an AddToDoUseCasePresenterInteractor class is literally taking the concept and making it 1-to-1 in the code. On the other hand, writing domain-appropriate code, minimizing accidental complexity, and recognizing the clean code concepts as emerging from groups of classes and methods and packages in a code base leads to really clean, testable, maintainable, and FAST TO WRITE code.

I think the single biggest improvement for Java programmers is to group all the classes related to a use case together in the same package - which means STOP MAKING "controller", "service", "model", etc. packages where classes that are unrelated except by kind are just dumped. If you're working on a part of the code base you should just have to change classes in one single folder. A new feature should just be a new folder. That change alone speeds up teams by huge factors.


Your last paragraph is really interesting to me. It obviously makes total sense. Yet the environment I've used most in my career is Rails, which also splits these things into folders by type rather than feature. It's never bothered me. Now I wonder if that's because Ruby isn't Java and folders aren't packages, or because I'm so very used to it.


I've worked in a few codebases that tried to group things by feature. In my experience, it never really worked that well.

Usually there would either be poor isolation between them or they'd be so well isolated that I'd wonder why they were even in the same project. In the latter, they'd often be difficult to maintain because of a web of dependencies pulled in by the little isolated subfeatures.

I prefer the separation by type, tbh. It also has the upside that it naturally encourages the developer to follow the same patterns within that particular package.


I'm not sure what I prefer because I'm most familiar with Rails. I guess I'm used to type-separation. An individual Rails app could be structured using "engines"[0], which could easily allow for this kind of separation. Each engine will have its own app/ directory which contains models/, controllers/, services/, etc.

The point is that the feature is its own "project" which would likely be loaded in the host application as a gem. I don't think this is actually a strong convention either way in Rails, so it would still be compatible with convention-over-configuration to build an app this way.

(Kinda just thinking out loud. FWIW, I have worked on an application with a similar sort of "engine-primary" structure but not what I currently work on.)

[0] https://guides.rubyonrails.org/engines.html


There's a certain amount of disciplined duplication that you have to adopt as well. I think this hangs a lot of people up and failure to do it right leads to the complications you're discussing.

Adding a todo note, marking it done, editing it, and deleting it are all different use cases and each has its own package and classes. However, most people want to have some kind of single ToDoNote class which is the "To Do Note". Doing that means they have to decide which package to put it in and then pull it in everywhere else. And then "common" logic starts piling up and features depend on crap from other features and accidental complexity starts creeping in. Now you've got a single ToDoNote class that has different sets of properties null or populated depending on where it was instantiated and what use case it's being fed to, all requiring the programmer to keep this in their head instead of getting the compiler to help.

The reality is that the set of data you need to create a todo note is different from the data you need to edit it, which is different from what you need to delete it. They share common elements, but never at the same time. The solution is to create "NewToDo", "EditedToDo", and "DeleteToDo" models for each feature. Sure, some of the models will share a "title" property (new and edit), and some will share an "id" property (edit and delete), but never all at the same time. This offloads the complexity from the programmer to the compiler and speeds up development.
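
As a sketch, one possible shape for those models, assuming Java 16+ records (in practice each would live in its own use-case package):

    // Each use case carries exactly the data it needs; the compiler, not the
    // programmer's memory, tracks which fields exist where.
    record NewToDo(String title) {}
    record EditedToDo(long id, String title) {}
    record DeleteToDo(long id) {}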

You already know you're operating at the lowest of the low levels of excellence if you rely on monkey-see-monkey-do code standards. Having low-complexity code that needs minimal programmer brain space to operate in is how you level up.


> If you're working on a part of the code base you should just have to change classes in one single folder. A new feature should just be a new folder. That change alone speeds up teams by huge factors.

This is a neat idea. Have you done this in practice, and how does it work over a long time frame?

One of the big advantages of separate packages is purely for references: The model package has no reference to service or database code, so it's not possible to include SQL or other hidden service calls in it -- at least not without adding a new dependency which makes it blatantly obvious you're doing something wrong.

On the other hand, if your features are self-contained, and you have good unit test coverage of all the logic, then I guess it doesn't really matter as much what the structure is. The fact it's unit tested forces it to be loosely coupled, and testability is one of the main reasons to organize code into layers in the first place.


Been doing this for a few years. If I have an `Item` class, it's going into its own package. Along with `ItemService` (business logic), `ItemResource` (endpoint), `ItemDao` (persistence interface), etc. If `Widget` has a dependency on `Item`, then `WidgetService` can either import `ItemClient` or roll its own.

Makes it super easy to split out microservices when the monolith gets big. Just keep from injecting one Service class into another, rely on the Resource or Client instead.


> WidgetService can either import `ItemClient` or roll its own

Can you clarify what is ItemClient in your context?


To the extent I feel the need to do this (in very large projects with many teams), I think it's best accomplished with separate modules. My favorite way is a single-repo multimodule project with the core code in one module, all the database implementations in another, and a very small third "main" module that brings them together and has the startup code.

But there are other solutions. You could probably cook up some linters or some other compile time enforcer.


Been doing similar for years... I refer to it as feature oriented structure/organization. To me, that includes tests. I hate that so many code bases are effectively mirrored trees that are hard to break apart.

You can still have effective layers, even classes if you like them. But there's little reason they can't live next to each other on disk in the same project even.


> sourced from demonic inspiration.

Actually writing demonic technical debt on purpose in one sitting would be quite an accomplishment too.


> Guilty as charged! I hate using Java because everything written in java seems to blend into the same indistinguishable swamp of classes with meaningless names, full of methods that constantly find new and interesting ways to obscure what your program is actually trying to do. Debugging very large Java codebases feels like living through Terry Gilliam's 1985 film Brazil.

That describes just about every codebase I've worked with that relies on object-oriented patterns, which is basically every codebase. This is particularly bad with Java, but as you mentioned, this is a cultural problem and not so much a language problem. I like Java the language, but what kept me away from it was every Java codebase I've seen. Layers upon layers of needless abstraction, overly abstract names, and so much code is meant to describe things rather than a sequence of data changing.

It's not that OO is completely wrong, but it's a meme that we as programmers are refusing to shake. It's like people are still getting taught OO in college courses by programmers who haven't worked professionally in decades, and those students are still going into the real world thinking that everything's gotta be object oriented. And usually what OO ends up meaning is having classes and inheritance and "this" and mutability, as opposed to having objects that pass each other information. The latter doesn't need classes, or inheritance, or any of the other similar features in programming languages. But if you've got to write to a file, then you've gotta make a class that wraps the file system functions, right? /s


Bingo!

I’ll never forget working for a tiny startup under an ex Google CTO in 2013 who wrote plain Java code and chose simple libraries. Saying it was a breath of fresh air is an understatement.

Code is culture and culture is code.


> It seems like there's a lot of people in the Java community who still think OO is a great idea

If you're coding in Java, you'd better think OO is a great idea. It's an object oriented language. And despite having loosely bolted on FP paradigms, that's not really going to change.

Although I feel most of the criticism against OO is actually more like a critique of the FactoryBuilderFactoryImpl-style application of design patterns, which is something else and really unfashionable today in Java.


> If you're coding in Java, you'd better think OO is a great idea. It's an object oriented language.

There's plenty of room in Java for nice, clean code. All "java is OO" means in practice is that your code needs to be in classes. Some things that work great in java:

- Separate out value types from everything else. Value types should usually be tiny and have public fields. They should not contain references to any other data, and any methods should be simple. ("getTempInCelsius()" is ok, "updateFromDatabase" is not.)

- Express saving / loading / processing work as (ideally) pure functions which operate on that data. You can use the class keyword + static functions to make a module / namespace in java. Use it.

- Use interfaces sparingly. Try not to use inheritance at all.

- (Controversial): If you find yourself with 18 different classes instantiated at runtime, where all those classes have exactly 1 instance (and probably all hold references to each other), you're doing it wrong. Use fewer, larger "controller" style objects (ideally 1), arranged in a tree (no references should point up the tree). If your classes get big, move utility code out into pure functions or short lived utility classes in adjacent files.

- Don't use factories. If you need to, use value types for configuration data.

Is this still "OO"? Depends what you mean by OO. I've heard this style of programming referred to as "data oriented programming", or something like that. Done well, data and data processing often end up in separate files (which is the opposite of how OO is usually done). Features can often land in separate folders. Performance usually ends up better, because you have less pointer chasing and indirection. You often get much clearer separation of concerns between classes. And it's usually much easier to write unit tests for code like this - since almost all classes can be instantiated and tested directly.

A lot of modern game development uses patterns like this, coded in C++. Most C code looks like this too. I've also read and written plenty of javascript / typescript which follows this pattern. It's very simple in JS/TS because you can use object literals (with interface definitions in TS) for your value types. In Rust, your large "controller" style classes with a million methods can have all those methods spread out across different files in your project. (Ideally grouped by feature.)
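
A compact sketch of that style in Java (hypothetical names, not from any particular codebase): a tiny value type with public fields, plus a class used purely as a namespace for pure static functions.

    import java.util.List;

    // Value type: tiny, dumb, public fields, only trivial methods.
    final class Reading {
        public final String sensorId;
        public final double tempCelsius;

        Reading(String sensorId, double tempCelsius) {
            this.sensorId = sensorId;
            this.tempCelsius = tempCelsius;
        }

        public double tempInFahrenheit() {
            return tempCelsius * 9.0 / 5.0 + 32.0;
        }
    }

    // "Module" of pure functions; the class keyword is used only as a namespace.
    final class Readings {
        private Readings() {}

        static double averageCelsius(List<Reading> readings) {
            return readings.stream()
                    .mapToDouble(r -> r.tempCelsius)
                    .average()
                    .orElse(Double.NaN);
        }
    }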


That can be OO. I think in general object inheritance is pretty unfashionable, and while it occasionally does solve problems, it's rarely the first thing to reach for. It can be useful in library code though, where you might otherwise end up with significant code duplication.

Overall the move is toward having data classes (basically records) that are light on logic, as well as logic classes that are light on data. I don't think pure functions are necessary or in many cases even desirable in Java, as it largely lacks the tools required for this not to be crippling; although I will concede that any state changes typically ought to remain local.

Factories can be pretty good for separation of concerns, although in many cases they've been superseded by dependency injection frameworks in modern Java. I still reach for one every once in a while though.

Also fwiw, game development is a bit of a weird case. It usually reaches for design patterns like ECS. This is in part a data locality optimization, since you can just iterate through all components in a "straight line", but mostly it's for the sake of malloc, which generally doesn't deal well with billions of objects being allocated and deallocated over and over randomly with growing fragmentation as a result. There are many types of programs this or other game dev patterns aren't suitable for.


You're one step away from loving Clojure... the quasi-official motto is "it's just data", and the resulting code tends to be simple and short.


A lot of smart engineers I’ve known have moved to clojure and fallen in love with it. I think you’re probably right.


> Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company.

I have to say I already expected the comments to be good when I read the title but this nugget of pure gold - I would PAY to read comments like this!


I feel like if Java had immutability by default it would be such a better language to work with. It is so hard to determine what actually gets modified where in a large Java codebase.


Too bad immutability breaks the quite-ubiquitous-in-JVM-land Bean spec (and I wonder what someone was smoking before drafting that spec).

I think having proper sum-types with pattern matching would have also made it a much better language.


It was designed when UML and tools like Rational Rose were being pushed hard by the big consulting firms on their clients as the One True Way to build Serious Business Software. JavaBeans was designed as a way to have a conventional interface that allows code components to be snapped into GUI applications. But someone thought it would be a great idea to use it everywhere else, too.


Can we all agree that the bean spec was not a good idea? No matter how old, or what other stupid ideas were fashionable back then. I cannot see how making things "beans" actually solves anything (besides maybe a slight reduction in boilerplate when (de)serializing bean-classes).


Java has been around for a longish time. Around the early 2000's there was at least a perception that you should avoid creating too many objects as that carried performance overhead for construction and garbage collection.

https://softwareengineering.stackexchange.com/questions/1495...

Immutable objects often require you to construct a new object to store an updated value and garbage collect the now unused previous object. So a lot of early Java code was written with mutable objects to avoid performance issues.

The Java Bean spec was written in 1997:

https://blog.joda.org/2014/11/the-javabeans-specification.ht...


I didn't say make everything immutable. I just mean that if you want an object to be mutable, you need to say so.


Bean spec is in no way mandatory. Plenty of existing code doesn't follow the bean spec. Immutable Java classes have been in vogue for many, many years.


Java now has sum types (sealed types) with exhaustiveness checking, pattern matching for records, and more general destructuring in the works for classes.
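
A quick sketch of what that looks like in practice (assuming a Java 21 toolchain; the names are made up):

    sealed interface Shape permits Circle, Rect {}
    record Circle(double radius) implements Shape {}
    record Rect(double width, double height) implements Shape {}

    class Areas {
        static double area(Shape s) {
            // Record patterns destructure the components; no default branch is
            // needed because the switch is exhaustive over the sealed hierarchy.
            return switch (s) {
                case Circle(double r) -> Math.PI * r * r;
                case Rect(double w, double h) -> w * h;
            };
        }
    }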


Java's records are immutable.


And most shops likely aren't using a JVM new enough to use records, and it will be a while before they are able to upgrade. And even then, most 3rd-party libraries out there won't use records, because they want to be compatible with the JVMs their users use. Hell, most popular libraries available on Maven Central are compiled with JDK8, or, at best, JDK11.

Aside from immutable instances, it would be nice if 'final' was the default, as well.


On the one hand sure.

But if we're going to critique Java honestly, can't we stick to the language the way it looks today and not the way it looked nearly ten years ago?

Modern Java is quite pleasant to work with. One by one the papercuts and footguns have been fixed.


So it sounds like the fix is available with an update. What more do you want?


Also still extremely hard to use due to how basic they are, e.g. there is no "with"-er method or syntax yet to modify existing values.


Withers syntax is being evaluated/designed as far as I'm aware. You can still get some good work done with records in the meantime.
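
For instance, a hand-rolled wither is a one-liner per field until the real syntax lands (a sketch, assuming Java 16+ records; Point is a made-up example):

    record Point(int x, int y) {
        // Hand-rolled "withers" until dedicated syntax arrives.
        Point withX(int newX) { return new Point(newX, y); }
        Point withY(int newY) { return new Point(x, newY); }
    }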


"Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company."

Awesome quote ... LOL.


Nobody writes FactoryBuilderFactoryImpl as a class name, they define an XML schema and then use a code generator to create the source files... 1/2 /s


That was the old way. Now we ask ChatGPT to generate it for us.


Had me in tears lol


Someone has to link this here, so it may as well be me :)

https://steve-yegge.blogspot.com/2006/03/execution-in-kingdo...


I've spent most of my programming life working in OOP. I see people critical of it, but I don't know what the alternative is for the kind of stuff I do for my job (not to say I have a choice in changing how we do it). Does anyone know of an open source project that implements a complex GUI app that doesn't use OOP so I can see what that code can look like?


We even use OOP in Javascript today, although encapsulation is still a bit sketchy and there is no support for interfaces and abstract classes. There is support for interfaces in Typescript, which was added because, well, it makes a lot of sense :)


Any functional react application would fit the criteria I imagine.


Do you know of any well written articles of critique against OO? I've read a few against Clean Code that actually went over the problems with the actual examples given. Nevertheless, I hardly ever see good critiques of the principles themselves.

I'd be interested in reading about it in more detail.


https://kyren.github.io/2018/09/14/rustconf-talk.html I rather liked kyren's keynote. It's particularly focused on games, but some of the same issues arise in other complex apps.

"Rust highly rewards data-oriented design with simple, understandable ownership semantics, and this is great news because this is a really good fit for game development. I suspect this is also true generally, not just for game development! (but what do I know?)"


In my mind there are two major issues with java/c# style OO.

1) Subclass universality isn't great for cognitive comprehension

2) Subtyping component of subclassing has no compiler enforcement

Okay, for the first point, consider the universal NAND gate. You can build any circuit that uses NOT, OR, AND, etc. logic gates by converting it into only NAND gates. This is great for manufacturing (I guess), where you can 'compile' a straightforward series of logic gates into just NAND gates and then you only need to produce one kind of gate in your hardware. However, imagine if you needed to program this way (say with an imaginary NAND (!&) boolean operator):

  today || tomorrow => (today !& today) !& (tomorrow !& tomorrow)

Yeah ... just rolls off the tongue.

Now, subclassing (and interfacing) provides an existential (to borrow from logic) property. That is, there exists some class such that the following methods exist (and maybe there's also some default implementations spread out amongst every super class in the inheritance hierarchy, but focus on the simple case).

Existential is also universal. You can use it to simulate a forall property (think generics), an or property, or an existential property. The problem is that now you're converting what would, in a more fully featured language, be a straightforward structure (like a union, ADT, sum type, or discriminated union) of This thing OR That thing into something much more complicated. It might be an infinite number of things which delegate logic that should exist right here into some other class someplace else, where you cannot see it without difficulty or maybe cannot see it at all.

Or you can just cast the super class into a sub class ... which isn't great either.

Regardless, you also have to hope that nobody goes off and implements another case that shouldn't exist, because subclassing is open, whereas an OR property in something like discriminated unions is closed. Knowing you have all of the cases considered is nigh impossible.

Now, this is good when you straight up want an existential property. It does happen that you get a thing and then you call a method on that thing and there really are an infinite number of things that it could potentially be both now and in the future such that you want support for that behavior. However, I assert that this requires much more complicated and careful coding and isn't applicable for most of the work that ends up needing to occur.

Part two is a bit more simple. When you subclass you're also declaring a subtype. The problem is that there's no compiler support or assistance to ensure that all subclasses are actually valid subtypes of the superclass. But it's a property that you get whether or not it's true.

So at any point in time you can have some object where you aren't supposed to know its actual class, but for which, if you don't know its actual class, you'll end up writing incorrect code. A superficial example can be had with exceptions (a whole other topic that will otherwise not be covered here). Imagine a Storage class which is just a key-value store. The CloudStorage class can throw NetworkExceptions, the InMemoryStorage class can throw OutOfMemoryExceptions, and the FileStorage class can throw FileNotFoundExceptions. Code that handles just Storage doesn't know which exceptions it might have to make sure it catches. The subclass isn't necessarily a subtype. [Of course you can open up a different discussion here about the appropriate way to handle exceptions, but I hope the simplified example here makes clear what the issue is. A more complex and realistic example can be constructed to show the same issue in a way that completely bypasses exceptions.]
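
To make the contrast concrete, here is a sketch of the closed alternative using modern Java's sealed types (assumes Java 21; all names are hypothetical): the permitted cases are fixed at compile time, so handling code can enumerate them exhaustively instead of guessing what a subclass might do.

    // The set of cases is fixed here, at the type, rather than spread across
    // whatever subclasses happen to exist somewhere in the codebase.
    sealed interface StorageError permits NetworkDown, OutOfSpace, MissingFile {}
    record NetworkDown(String endpoint) implements StorageError {}
    record OutOfSpace(long bytesNeeded) implements StorageError {}
    record MissingFile(String path) implements StorageError {}

    class ErrorMessages {
        static String describe(StorageError e) {
            // Exhaustive: adding a fourth case turns this into a compile error.
            return switch (e) {
                case NetworkDown n -> "network unreachable: " + n.endpoint();
                case OutOfSpace o -> "need " + o.bytesNeeded() + " more bytes";
                case MissingFile m -> "missing file: " + m.path();
            };
        }
    }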


I cannot dislike "clean architecture" enough.


'git reset' preserve us from acolytes of Uncle Bob!

Clean Code makes just as much of a mess in .net land too.


>"It seems like there's a lot of people in the Java community who still think OO is a great idea."

Java is not the only language in existence, and OO is a great idea, as are many other paradigms, when not overused. The problem is with the programmers who learn one language / paradigm and will fight tooth and nail to solve all the world's problems with it, no matter how poor a fit it is for a particular case.


Objects are a useful idea. Object Orientation is a terrible, terrible idea.

Functions + data (immutable when practical) is all you need for 90%+ of programming, and you should only reach for objects when necessary to manage some well-encapsulated but tricky state. Your default orientation should certainly not be towards objects.


>"Your default orientation should certainly not be towards objects."

My default orientation should be whatever my experience tells me is best for a particular case.


You should trust your judgement. But good judgement comes from experience. Anyone who only has experience programming in one language, or one style, will have rubbish judgement.

Go learn Haskell, OCaml or Rust. Write some C for embedded platforms. Make a video game using your own hand spun ECS in C++. If, after all that, you come back and say "yeah lets use a class here", then I'll trust your judgement.


I am paid for designing and implementing products. A language for me is a screwdriver: some are convenient, some not so much. I have programmed in many languages. I am familiar with the concepts but am not going to learn OCaml or Haskell, as there is zero ROI in it for me.

>"If, after all that, you come back and say "yeah lets use a class here", then I'll trust your judgement."

Look at it this way. I care what my customers say about my products, because that is how I have been making my money for decades already. As for you trusting my judgement - I do not give a flying fuck. Sorry for being direct.


Sorry - I just reread my reply above and I can imagine it reading more personal than I intended. I apologise.

The point I was trying to make is this - I’ve worked with plenty of engineers who want me to trust their engineering opinions. Some are experienced. Many are not. I have no idea where you sit on that spectrum - I don’t know you; obviously.

For what it’s worth, I think it’s the right call to trust your own judgement. I just don’t think enough people actually nurture their judgement by exploring and experimenting with a lot of languages and styles. After all, how would a self proclaimed “Java programmer” know what problems you can solve more easily in Python? I don’t think you can truly understand the strengths and weaknesses of OO (or any philosophy really) unless you spend serious time embracing other approaches.

Again, maybe you’ve done that and maybe you haven’t. I don’t know you.


My previous reply says "...I programmed in many languages..." and "for decades".


> I think the problem is cultural, not technological.

I'm sure that's true. You don't have to write classes with names like DoubleFactoryFactory; nobody does that in other languages. It's a tribal thing.

What's amusing (to me) is that Java isn't actually object-oriented, because many of the core datatypes aren't objects.


I have similar feelings on the .Net space as well. It's not the language (C#) or the platform so much as the "Enterprise" community.


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

The IDE that's written in Java most likely shares nothing with the Java ecosystem parts they "hate" (I'm guessing all of the EJB, applets and general 90's Java enterprise stuff). The only irony here is you thinking it does.

The 90's Java was a thing to hate: running a fat, complex app server with a bunch of code-as-XML for configuration just to run a simple API app. IDEs are not doing that (although Eclipse was a bit of a monster of glued-together parts).

> Did anyone in the mid to late nineties / early 2000s really discover Java and Java applets and thought: "Java applets is the tech that'll catch on, I'll invest in that" and not invest in Java itself?

...the entirety of management, it seems; it was the most popular hammer for the problem in the enterprise for quite some time. Hell, we still have some less-than-a-decade-old servers needing Java applets for KVM.


Ironically, deployment, management, monitoring and security of a single (even clustered) EJB app server was a much more straightforward affair than today's cloud-based Rube Goldberg machines that I encounter in practically every "modern" enterprise app.


Seriously, I work in ops with a side of coding and in order of "most pleasant" to "most annoying":

- Go and similar "one binary" deployments. Bonus points if static files are also embedded.

- "Just a JAR with a few parameters".

- Anything that comes pre-packaged in the OS. Perl with OS-provided libs.

- Enterprise Java with code-as-XML for config. More pleasant if we're not the ones managing the XMLs.

- C/C++ apps with reasonable deps that don't need recompiling half the OS.

- Bare containers. Shit visibility without a lot of tooling, but at least we don't have to fight deps.

- k8s. The complexity of the system rarely pays off for the relatively simple apps our devs are deploying, and it needs even more tooling for good visibility.

- Ruby outside of a container - just fucking kill me. A bunch of system libs required just to compile "gems" (why they can't call them libs like everyone else, I don't know), long startup, memory leaks, and it took them 10+ years to make a web server that's not completely shit...

- Python - if I see something is in Python I try to find alternative solutions. I swear half of the install instructions either just don't work or shit all over home/system dirs. The absolute clusterfuck of the py2-3 migration didn't help either; thankfully it appears it's finally over. Bonus demerit if instructions mention setup.py. Though to be fair I haven't tried recently, maybe it is better now.


I have been explaining to folks how OpenAPI is basically WSDL-lite. And, at its current velocity, it is building up to be just as big. It is hard not to see Kubernetes and friends as non-JVM GlassFish servers. I remember WAR files being far easier to reason about than the current bundle setups we have today. SAR files were an amusing stab at the same thing, complete with the server managing deployment/startup order.

I still don't want to go back to that time, mind you. Which is probably why I'm so hesitant about the current attempts.


Yep. For small projects, why is it so hard to have a "deploy to server" button next to "run locally" in my IDE?


For the past several years, Java enterprise development has been very, very easy compared to the mess that is the node.js ecosystem. And if you are not happy with the JEE environment, you can use frameworks like Micronaut, Quarkus, etc.

For instance, in Quarkus, you can have a full REST application in about 12MB of RAM, running on GraalVM.

In Spring (often looked at as a heavy framework), this is the code required to write a full REST endpoint:

    @SpringBootApplication
    @RestController
    public class DemoApplication {

        @GetMapping("/helloworld")
        public String hello() {
            return "Hello World!";
        }
    }
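(To actually boot that class you would also add the usual Spring Boot entry point, something along these lines:)

        public static void main(String[] args) {
            // standard Spring Boot bootstrap, shown here only for completeness
            SpringApplication.run(DemoApplication.class, args);
        }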


> The IDE that's written in Java most likely shares nothing with Java ecosystem parts they "hate"

The most vocal Java haters I have met were JavaScript programmers who knew basically nothing about Java. Their hate was purely cultural - they learned on discussion forums that they should hate Java. But they had literally zero knowledge whatsoever.


> purely cultural

oh, I got one to admit he was "intimidated" by Java.

Note: there is a subset of developers who have either hit one of Java/JVMs limitations, or, subscribe as a matter of professional judgement and taste to a school of programming (say FP) that is not a good fit to Java and JVM's orientation (classes, OOP). This subset is an informed subset and their feedback is reasonable and at times has helped Java/JVM move forward.


I don't particularly like Java (but I did love it back in 98 or so when I came across it, coming from C++), but it's mostly due to the proliferation of OO monstrosities created by architecture astronauts. The language itself is ok, if too verbose. The JVM is awesome when you use something like Clojure on top of it.


It’s funny, but a lot of JS haters are Java devs who also don’t know a thing about JS, or at least think JS still looks like it did in the 90s, with callback hells and whatnot.


I know both JavaScript and Java. Java trumps JS by a mile.


>> bunch of code-as-XML for configuration

Code as JSON or YAML is fundamentally the same thing, while being nicer to look at (YAML, at least).


It's in a way more terrible, as it's usually a templating language + YAML.

So you need to navigate the pitfalls of YAML (or bear the ugliness of JSON), all while learning a templating language.

All because someone decided "Users will not want to learn Python for a DSL, let's give them YAML"...

Every single tool that went that route has that problem, and every single one would be better served just picking <language> (Lua is nice and embeddable, just saying) and using it as a DSL.


It's a step back in a way. Code as XML had DTD with on the fly validation and autocomplete. JSON and YAML are still playing catch up in this area.


I have lots of biases against XML. For me it is nearly personified as Steve Ballmer. However, it is rather funny to see people who hate XML yet love HTML (or, even more absurdly, XAML). The autocomplete feature makes XML somehow not completely horrible, as you can easily define a schema and get a kind of mini language with zero effort. Eventually the mind removes the tags and you are just left with the content. However, I see text formats that describe tree structures as all fundamentally the same. It is impossible for JSON or YAML to be waaaay better than XML, as it is the exact same thing.


> The IDE that's written in Java most likely shares nothing with Java ecosystem parts they "hate"

IDK, I think the irony is you thinking IDEs wouldn't use those patterns. I bet you'll find tons of weird interfaces and abstractions in IntelliJ. It is a huge and complicated code base.


I assure you code as XML has not disappeared from enterprise Java.


It has not disappeared from the enterprise, and this has nothing to do with Java. XML is still very good as a document format, with powerful validation capabilities (XSD), a powerful language for transforming documents (XSLT), and one for searching documents (XPath). It is still used for a reason.

In addition there are standard ways of encrypting/signing XML documents.
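As a small illustration of the XSD point, validating a document against a schema is only a few lines with the JDK's built-in javax.xml.validation API (the file names here are placeholders):

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class ValidateXml {
        public static void main(String[] args) throws Exception {
            // Load the schema and validate a document against it;
            // validate() throws a SAXException with a precise error location on failure.
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("order.xsd"));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new File("order.xml")));
            System.out.println("order.xml is valid against order.xsd");
        }
    }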


Those aren’t the parts that are used though - it’s the Spring XML as the replacement for the “new” keyword that is more likely to be found than XSLT or XSD.


> My fellow dev often laugh when I tell them that instead of looking at all the long dead techs that are not useful to me anymore, my way to feel good is to look back at all the long dead techs that I didn't bother to learn.

I share this sentiment. All my adult life I've been in contact with the "tech-anxious" kind of dev who needs to be up to date with all the new frameworks all the time. While I've entertained several new trends, I found it counter-productive to my well-being to focus on more than 1-2 stable stacks at a time. There will always be an order of magnitude more failed frameworks than successful ones, and this perspective keeps me healthy.


IMO an argument against this is that the benefit is not in the new tech itself, but in the different design paradigms it exposes you to and the thought patterns you learn.

Even if Rust were to die off tomorrow I’d be happy I invested my time in getting familiar with it as I’ve learned quite a bit about memory management and ways to reason about it.


Paying attention to the design paradigms is truly the interesting aspect. The way you write code in a language isn't fixed or prescribed. You see evolution of languages/frameworks, influence between them, and repeats of what one tech has tried and failed at (sometimes the new tech tries again and succeeds). The great thing is you can take those learnings back to the language you're using! If you mess around with some Angular or RxJS front-end stuff, you can take that style to the back-end if you want. If you've seen benefit from strong typing on the back-end, you can add TypeScript to a front-end project. Rust and Kotlin have even influenced Java core libraries for the better.

By paying attention to the designs it's easier to figure out what's marketing-driven hype as well. For example, do I really need to jump onto the next serverless platform, or are they just repackaging and upselling AWS Compute resources? Can I accomplish the same thing as the latest edge computing platform by using service workers and a free CDN? Is Firebase Hosting basically a second interface for GCP Cloud Run?


Around here, banks, for example, are big on Java, and banks are not going away anytime soon, nor is their Java code, which represents huge decade-scale efforts to move away from COBOL on IBM machines (both of which are seemingly not going away either). And that's only one data point.

Sexy? nope. Hot new thing? neither. But the jobs are out there.

Similarly that mention of RoR being "in danger" seems incorrect. It's just that Rails is not the topmost big player in the market, and I don't think it ever was, even at its early peak. Sure it made rounds in the news and had a big impact but even back then it certainly wasn't The Tech That Everyone And Their Dog Uses.

That's fine. A tech doesn't need to be in the top 3 market leaders to be useful, successful, and lively.


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

Reminds me of so many devs hating on Ruby and the number of times I heard that Ruby isn't used for anything relevant, while the huge majority of them either use GitHub or Gitlab for their work.


Does anyone say that? It's out of vogue but surely there must still be plenty of RoR about the web. Shopify too.

Ruby not on Rails 'not used for anything relevant' is maybe a bit more defensible. Not sure it matters/what the point is though.


> long dead techs that I didn't bother to learn.

this is how I currently feel about Kubernetes and docker. I'm having all sorts of fun with the JVM, a monolithic jar, and about 20 lines of shell script. I can deploy in 20 seconds without much fuss.


Absolutely. Java development is ridiculously easy compared to many of the alternatives... in some senses, it is actually even easier to develop in than PHP, especially since Java is statically compiled.


plus, it has super powers. For example, with Adama ( https://www.adama-platform.com/ ), my language translates to Java on the fly, compiles, and class loads to provide a new kind of serverless infrastructure.

The "cool" kids are building WebAssembly while the JVM has it all.


Conversely, I’ve often heard people say they like working in tech because they’re “always learning new things!” It seems pretty clear to me that most of the stuff I learn in tech is a total waste of time, except for the fact that I need to understand it to get paid. It feels as if I’m learning how to work on puzzles in a book. (eg, Sudoku, or other logic puzzles.) It might be interesting to work through them, but they don’t really have any value in my life.


> my way to feel good is to look back at all the long dead techs that I didn't bother to learn.

That's how I feel about design patterns. Talk about the most useless pile of academic bullshit that gets overemphasized in terms of what "every programmer" must know before going into an interview. Although there are some domains where it may be necessary to know them off the top of one's head, for the most part there's usually no utility in filling your head with individual ideas that worked in one circumstance but may lead to bad patterns in another. I'm so glad I gave up memorizing design patterns, especially given that many employers seem to have lost interest in trivia questions about design patterns as well.


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

Don't forget Confluence and JIRA and all Atlassian products. Minecraft too. I facepalm when people say Java this and that


Minecraft is actually a really good example of the opposite. Maybe some might hate the Java edition of Minecraft because of its frequent GC pauses, but the new performant Minecraft Bedrock written in C++ isn’t really that popular because of lackluster mod support.

The reason Minecraft was able to have such a large mod ecosystem (even when it still doesn’t have an official modding API!) is that JVM bytecode is easily reverse-engineerable and extendable (…relatively, compared to fully compiled C++ code). Bedrock actually tried providing a modding API via JS scripting, but it was pretty lackluster in its flexibility and thus users mainly stayed in Java edition for the mods (using robust ecosystems like Spigot/Paper)
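As a rough illustration of how approachable compiled JVM classes are, listing every method of a class takes a handful of lines with the widely used ASM library (assuming org.ow2.asm:asm is on the classpath; the inspected class is just an example):

    import org.objectweb.asm.ClassReader;
    import org.objectweb.asm.ClassVisitor;
    import org.objectweb.asm.MethodVisitor;
    import org.objectweb.asm.Opcodes;

    public class ListMethods {
        public static void main(String[] args) throws Exception {
            // Read a compiled class straight off the classpath and list its methods;
            // mod loaders build on the same kind of bytecode inspection and rewriting.
            ClassReader reader = new ClassReader("java.util.ArrayList");
            reader.accept(new ClassVisitor(Opcodes.ASM9) {
                @Override
                public MethodVisitor visitMethod(int access, String name, String desc,
                                                 String signature, String[] exceptions) {
                    System.out.println(name + " " + desc);
                    return null;
                }
            }, 0);
        }
    }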


Who are the people who belong to "I hate Java" and "I love Jira" sets? I'd love to see that Venn diagram.


Do people who loves JIRA exist for real?


Worked at a AAA game developer. They had a shit ton of custom workflows built on top of JIRA.

Imagine you want to add a weapon to the game. With one click you could generate 100s of tasks for all the related stuff. It touches pretty much all of the game. This is a very, very incomplete list of the things Jira created tasks for:

- concept, 2d, 3d art

- sounds, and there's quite a bit of them: just gun sound, reloads, impact on different surfaces etc

- animations - this is a large one

- writing: for example, background info on the weapon

- gameplay design: how the gun fits into the game

- world design: where the gun can be found, who (some fraction?) uses it

- quest design: maybe the gun is a reward in some quest, or is used in a particular way

- balance

- obviously, programming for all the above

- and even more obviously, testing all of the parts.

And that's one workflow. You had those for many parts of the game.


I wouldn't say I 'love' it, but at the same time I don't really get all the hate. I've recently moved from a large'ish (50+) dev shop to a much smaller team of 3. JIRA was and is used at both places. It continues to serve both orgs equally well.

Perhaps what people hate are all the ceremonies and bureaucracy that arise around JIRA in dysfunctional organisations, and not the product itself? The red tape and bullshit was just starting to take over at the old place (part of the reason I jumped ship), which made 'standups' that should have taken 5-10 minutes a 90-minute soul-destroying odyssey. But that's not JIRA's fault either.


What's not to love?

Infinitely configurable and extensible system that can accommodate any workflow and any process that comes to mind.

It can be used and also misused, but blaming Jira for its misuse is like blaming Lisp because it's too flexible and too powerful.


Indeed. Blaming Jira because it's missing this or that feature, or because the managed version is slow, is fine. Blaming it because it's an implementation of the twisted processes of your organization is not (and that's what most people usually do, even if unknowingly).


That's exactly the problem: we need a JIRA that is quick, efficient, and not a death trap of a Corvette in the hands of an adult with the mind of a 16-year-old. Pretty please...


Linear


That's precisely the problem, that simple workflows need infinite configuration.

But there aren't truly great products in this market; the fact that every organization does things differently puts limits on what is achievable.


Sure. Get out of the HN bubble, go attend almost any Atlassian event -- their main user conference has thousands of attendees -- and you will easily run into people who love Jira and/or Confluence.


If you have a complex workflow that is supported by Jira, then you could use a simpler tool by switching to a simpler workflow.

Keeping the complex workflow but using a dumber tool (e.g. having to track it via post-its, emails, chat, six different spreadsheets on different SharePoint servers, plus sign-off in two different custom in-house webapps) would be worse in every aspect.

So when people say they hate Jira, they really hate the combination of Jira + the workflow under it.

I'd happily live without complex processes. But IF I have to use a complex process, I do love having one tool to handle it with, instead of eight. I made the switch FROM the 8 different spreadsheets and webapps INTO Jira/Azure DevOps/whatever several times, and I loved it every time, because it's a less bad solution. It's not a good solution (that would be reducing complexity in the process). But for sufficiently complex organizations and tasks, sometimes you need a complex process and a complex tool to maintain it. And I guess in that situation no one will love the tool, even though it's the least bad one.


Yes, hello fellow developer, every alternative I have used in the last 30 years was worse at enterprise integrations.


I agree; it's not that Jira is good, but that the alternatives were worse.


> Do people who loves JIRA exist for real?

I wouldn't say that I love JIRA, but every time I work at a company that doesn't use it, the alternative in place always feels much worse.


When the alternative is Azure Devops...


I believe some managers do.


People don't hate Jira because it's written in Java. They don't know or comprehend that something like Jira or Confluence or Minecraft can be written in Java.


Jira does, somehow, faintly smell of Java. There's just something about how slow it is, and how maximalist it is in supporting every whimsical idea that any rich corporate customer has ever asked of them. No matter how poorly the request blended with everything else Jira does.

Mind you, I've only ever used Jira when it's been set up, poorly, to fit into a badly designed "scrum" process. People say it can be usable if you turn most of its features off. But the same is true of "smart" appliances, and that doesn't make me want to buy any of them.


Every company I worked for eventually used Jira, and it's always set up poorly to half-support a badly designed process. At this point that's just a regular Jira installation, I guess.


Java isn't slow, quite the opposite in fact.

https://www.techempower.com/benchmarks/#section=data-r21


Don’t worry, enterprise Java programming patterns will fix that “performance” thing right up. Call a function? Goodness no, not without a helper class being instantiated first. Oh, you wanted to instantiate a helper class? Better call a factory to do that. Not directly - have you heard of Dependency Injection? We configure that via XML here.

See? Fast performance isn’t a problem any more. You can make it go away in the time it takes you to load Jira.


JVM startup time is how it got that reputation, I'm guessing benchmarks tend to exclude that.

Back in the early/mid 2000s, Java-based applications often had a Java splashscreen as it loaded, so even non-technical people made the association.


Indeed. We moved off of JIRA years ago, but not that many years ago that this isn't still funny every time I think of it:

    veidr 2016-08-22
    Haha, JIRA is very "enterprisey":
    > Unable to find source-code formatter 
    > for language: python. Available 
    > languages are: actionscript, html, 
    > java, javascript, none, sql, xhtml, xml
That's me commenting, but I can find it easily whenever I want just by searching my company's chat for "XHTML".

We were using their hosted service, and it was a dumpster fire inside a meth lab...

(I did have a fine experience with JIRA before, though I want to say it was 20+ years ago (?) but my takeaway was that you needed a full-time JIRA admin person (and of course back then "on-premises" wasn't a thing, because it was the only thing...))


Jira's slow because it's a badly written piece of crap.

Requiring 20MB of data transfer for each page load isn't Java's fault.


Java is not slow.


Yup, Jira is indeed very hatable before knowing its implementation details.

I don't know if it's a strong argument for Java however, saying that god-awful software written in it is universally despised, but for reasons other than the language choice.


Saying that a piece of software is god-awful when it's commercially successful is not a good argument. Certainly tons of people and orgs out there are deriving value from it. I have yet to see anything that comes close to Confluence for arranging organisational knowledge.


Do you imply a positive correlation between quality and commercial success? I don't think I can name many instances of this, really.

E.g. by far the most commercially successful video games are the bottom-of-the-barrel casinos for kids.


I am implying that there's evidence that more people like it and are willing to pay for it than find it not up to the mark. Yes, that's the definition of successful software.


How would you explain SharePoint if this followed?


Oh, how I hate our management for forcing SharePoint on us.


Software success seems to be 90% about advertising and finding innovative ways to destroy your competition. Microsoft are the masters of that approach.


Unfortunately that won’t be possible, since the “I love Jira” set is empty :)


You can absolutely tell Jira is written in Java and not in a good way. I guessed the first time I went through one of the '...' > 'Move'/'Clone'/'Convert to subtask' flows.


They don't hate it coz it's in Java...


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

I'm not sure why this is a salient point. There's no contradiction in hating Java and using an IDE written in Java. Lots of people hate C, too, but that doesn't make it strange that they use lots of software written in it.


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

And using AWS that has most services written in Java.

Java improved a lot in the last few years and it’s not going anywhere.


I think Java virtual threads and structured concurrency are going to only improve its standing.

People complain about Java, but then it could be much, much worse. What we have is a rich open-source ecosystem, great backwards compatibility and carefully considered new features.
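For example, on JDK 21, fanning out ten thousand blocking tasks onto virtual threads is just a few lines; a minimal sketch using the standard java.util.concurrent API:

    import java.time.Duration;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.stream.IntStream;

    public class VirtualThreadsDemo {
        public static void main(String[] args) {
            // One virtual thread per task; the blocking sleep() parks the
            // virtual thread without tying up an OS thread.
            try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
                IntStream.range(0, 10_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(Duration.ofMillis(100));
                        return i;
                    }));
            } // close() waits for the submitted tasks to finish
        }
    }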


Java, COBOL, C#, JavaScript, etc... all tech that's not "sexy" but forms the backbone.

Even "dead" languages like COBOL aren't dead.

These techs are the ones that'll be here in 40+ years - or at least until Skynet rewrites them in binary and we're nuclear ash. I think COBOL will be driving the satellites that monitor the heat death of the universe.


I remember about 12-14 years ago I was deciding between learning Python vs. Groovy. I didn't know anything about programming then and for some reason at that particular point all articles/advice I found were saying the future is either Python or Groovy. Luckily I ended up going with Python :)


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

Didn't they also make Kotlin so they could use the JVM without Java?


> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.

I don't use an IDE (or it's not written in Java, depending how you define it), but I think I can reliably tell a Java app - if not before - when it gives me an error.

They always (or often - I can't say it's definitely not Java when I haven't identified it) seem to spew stack traces that the end user doesn't care about, or in many cases (outside an IDE) wouldn't even know what they were.


What is most funny to me in hindsight is how much I had to fight off warnings and offers to learn Macromedia/Flash development... I literally had tons of people who were only making $45k a year telling me I was a dinosaur while I was making $70k investing in PHP dev...

Fast forward to now and PHP jobs are paying over twice the old rate and grinding along while Flash is long dead and buried. Follow your instincts, not your friend's advice. :P


I'd rather claim that the main reason is that the author seemed drawn towards proprietary tech stacks that the companies behind them wanted to fully control and that contributed to their demise. It was clear from the start that Flash and VB and ColdFusion and all that wouldn't last long, just because no open ecosystem could form to keep them adapting and vibrant.

I feel that's substantially different with Python, Go, JS and ... C/C++.


I love go, but still, one of those things is not like the others.


This comment helped me realize that part of what I instinctively dislike about golang comes from ColdFusion PTSD. I just realized that golang reads to me like ColdFusion in the worst ways, including and especially its terrible approach to error handling.


I doubt though that error handling makes or breaks a language. How the ecosystem works seems to be what counts, and as long as Google doesn't close it down any further, it can stay with us for a long time.

I agree that Go is special in this list. If Google pulls the plug someday, then it would all depend on whether a strong community takes over the compiler and language maintenance itself.


Error handling absolutely breaks the language for me. I understand why many don't like exception handling in many contemporary languages like C#, but I think that if you are going to do a sum type instead and force handling errors as a result type from functions, you could at least use a proper Either monad and some form of monadic binding to remove boilerplate and enhance composition. (This was a problem with ColdFusion too. I can go on at length about some of the terrible "circuit breaker" boilerplate in that language and how tough it made handling errors correctly without killing entire applications.)
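For what it's worth, even in plain modern Java a minimal Either with monadic binding is only a few lines. A hypothetical sketch (not any particular library), just to show the shape of the boilerplate removal:

    import java.util.function.Function;

    public class EitherDemo {
        // Minimal Either: Err short-circuits, Ok keeps composing via flatMap.
        sealed interface Either<E, T> permits Err, Ok {}
        record Err<E, T>(E error) implements Either<E, T> {}
        record Ok<E, T>(T value) implements Either<E, T> {}

        static <E, T, U> Either<E, U> flatMap(Either<E, T> e, Function<T, Either<E, U>> f) {
            return switch (e) {
                case Ok<E, T> ok -> f.apply(ok.value());
                case Err<E, T> err -> new Err<>(err.error());
            };
        }

        static Either<String, Integer> parse(String s) {
            try { return new Ok<>(Integer.parseInt(s)); }
            catch (NumberFormatException ex) { return new Err<>("not a number: " + s); }
        }

        public static void main(String[] args) {
            // Failures flow through the chain with no if-err-return boilerplate.
            System.out.println(flatMap(parse("21"), n -> new Ok<>(n * 2))); // Ok[value=42]
            System.out.println(flatMap(parse("x"), n -> new Ok<>(n * 2)));  // Err[error=not a number: x]
        }
    }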

As for the ecosystem, most of what I've seen is a language that seems built for supply-chain accidents (random github.com main branches as far as the eye can see), built by a company with a track record of developer tools that provide a terrible developer experience and live in their own little bubble, divorced from most of the rest of computing. (These are also things that seem very familiar to me from my brief, horrifying time with ColdFusion.)

If golang works for you then that is great, but it upsets me to look at, and right now you'd need to pay me a lot to have any interest in maintaining code written in it. I have enough scars from terrible things like ColdFusion on my resume.


> Error handling absolutely breaks the language for me.

I think you missed my point. This is not about whether I personally like go (I don't) or its error handling (I don't). It's about which aspects, historically, tend to lead to a language's demise over a few years. The claim is that error handling is not one of them.

C isn't great with that either. It's still heavily used.

The supply chain issues are more relevant, but the key question then is whether the language and ecosystem evolve to avoid this. Golang has shown itself to be flexible: generics were unthinkable not too long ago, and now they are there.

Neither golang nor C are my favorites, and you can argue all you want why golang is not for you, but that misses the point entirely.


We are definitely having two different conversations.

I don't have much to say on the possible longevity of golang or not. Longevity is sort of guaranteed as soon as you write code in a language used in Production. Today's hip programming language is always tomorrow's terrible "legacy code" language. Someone has to support that long after the fact, whether or not the language itself ever remained in support or "alive". That sort of longevity is inevitable, per the law of averages.

I'm mostly just commenting on the way that writing ColdFusion, in the brief window when it was a supported, "alive" language, actively "hip" among certain types of baroque Enterprise development and developers, still felt (to me at least) like prematurely writing "legacy" code. Single (arguably bad) vendor, bad error handling leading to lots of ugly boilerplate, bad development tools, bad ecosystem, et cetera. Those are a lot of orthogonal concerns compared to longevity. ColdFusion had a longer supported longevity than it is often credited with, but that doesn't really give a sense of how much it felt like a "dead man walking" even in all those supported years. ColdFusion to this day has a massive "legacy" longevity that is likely invisible to much of HN, and it still fills plenty of ugly legacy-code niches in dark-matter Enterprise development even as a dead language; but the difference between today and when the language was "alive" doesn't feel all that dissimilar, because it was always a zombie language in a zombie ecosystem.

Golang seems to me another "born (un)dead" language in that way: even while it is actively supported, certain things about it feel to me like "a dead language walking" (with all code in it feeling like "legacy code" even as it is written), and I think that's what I've been trying to find the right words for, after months of seeing Go code in increasing places and having a bad gut reaction to it.

In that feeling of "born (un)dead", I don't see C there at all. It's impossible to argue that it isn't an alive language, with plenty of multi-vendor support and nearly always has been (even if it is definitely not the best language to work with in 2023). (Objective-C, rather than C, is easier to argue has had many decades of seeming "born (un)dead". For another modern example, I think you can make a case that Swift also fits closer to "born (un)dead", even though I like its error handling better, mostly, I think it seems healthier than Objective-C in some ways, and I don't hate looking at Swift example code in the same way that golang gives me the heeby jeebies.)


> Java applets were never that big. They didn't work very well (for the reason you mention) and weren't ubiquitous. They also nearly all looked like shit.

Not sure if it's the same thing, but... when I was 16 or 17 I did tech support for an uncle who used to do trading with a trading client application written in Java, plus some other tech stuff.

The Java thing was launched via a desktop launcher, and the splash screen, I later found out, was Java Web Start (.jnlp extension, IIRC).

I didn't see the value at the time, but now that I think of it, Java Web Start was a nice solution for managing installed "apps" and fetching updates automatically. IIRC there was a setting to always download the latest version, if possible.

Needless to say, this was ahead of its time.

As I move forward in my career I always find some new interesting thing about the Java virtual machine; it truly is a marvel of technology.

The latest things that amazed me were the flight recorder and the Mission Control toolbox (very cool to be able to see in real time what your application is doing and why it's performing the way it is) and the new garbage collectors (we're currently using G1, but I want to propose testing Shenandoah or ZGC).
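(For reference, both are driven by plain JVM flags these days; roughly something like the following, depending on your JDK version and build, with app.jar and app.jfr as placeholder names. The recording can then be opened in Mission Control.)

    java -XX:+UseZGC \
         -XX:StartFlightRecording=duration=120s,filename=app.jfr \
         -jar app.jar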


Even more than Java Applets, I remember looking at Java Beans and being heavily, mightily underwhelmed.


I had a job coding Java applets for higher ed when Java was still in beta.

A Java applet could be small and fast-loading, but that required careful scoping, design, and coding. In JDK 1.0 you didn’t have JAR files, and there was no serialization built into the platform. I wrote a program that would let you edit finite element meshes and submit them to a FORTRAN code running as a CGI, and the serdes code was about as large as the rest of the program.


> Java applets were never that big. They didn't work very well (for the reason you mention) and weren't ubiquitous. They also nearly all looked like shit.

It's also entirely possible you were encountering them more than you thought. My high school website, for example, had around 4-8 on the homepage - minimalistic applets that simply showed and cycled through images, placed in a grid in the header area. They totally could have been JavaScript instead; you couldn't tell the difference just by using the page.


> Did anyone in the mid to late nineties / early 2000s really discover Java and Java applets and thought: "Java applets is the tech that'll catch on, I'll invest in that"

I did. Getting people to install JRE didn't really seem that hard of a problem, and neither did sandboxing.

Around the same time Rockstar Games released GTA 3 with its third-person camera, and I thought this is the worst idea ever. 20 years and 400 million sales later, I still hate it.


Yep - Applets never really worked. The problem was the interface between the browser and the applet was buggy.

I wrote a very simple applet in the early days - it leaked memory - nothing I could do.

Also the deployment story - with a single shared version of the JVM (at a time when it was moving quite fast) in the browser - was an issue.

I quickly decided applets were a dud, but Java itself was fantastic. It's easy to forget what the other options were at the time if you wanted to write complex servers or cross-platform desktop apps.


I agree with most of what you're saying, but some Java applets were quite big, just not for everyone. RuneScape is the first that comes to mind; some websites that had lots of Flash games also had quite a few Java applet games, although I don't remember their names. There was also a Google-Earth-like program that ran in an applet, although I don't remember its name.


I'm not sure why hating Java means I'm supposed to hate things written in Java, or where the irony is there


> Many may not find Java sexy but the JVM is one heck of a serious piece of tech, with amazing tooling available.

Not only that, what about the guest-language folks spitting on the Java libraries and runtime infrastructure (written in a mix of C++ and Java) that make their shiny snowflake possible to start with?

Same applies to the C# bashing on the CLR side.


Definitely agree. It’s not “the code has been deprecated”, it’s “your attitude has been deprecated”. One of the big challenges with technology is staying current because it moves so fast. We’ve all seen trends come and go, even outside of software. If it wasn’t evolving we’d be doing something wrong.


I learned Java applets in school in 2009 and it was clear to me then that they were sort of… awkward. Java itself is fine and I still do a little Java work from time to time, and it’s a breath of fresh air from the daily Python work.


irony of that irony is like celebrating you crapped your pants just a bit less than the next guy


I feel like this is heavily connected with the idea of legacy. I grew up in Scotland, and lots of the buildings, culture, etc, have been around for hundreds of years. It would be nice to feel like something I was building would last as long and outlive me. It doesn’t though.

Sometimes I think that this is just the nature of software development. Most of the stuff I build is built to solve an immediate business problem. It probably lasts 5/10 years and then someone rewrites it in a new language, or more often the task isn’t relevant anymore so the code gets deleted.

I find myself thinking that maybe if I’d been in civil engineering or something then I’d be building stuff that lasts, but speaking to people who’ve worked a long time in construction has taught me that it’s the same there. Most of the buildings that go up, come down again in a few decades once regulations/fashions change or the new owner of the site wants something else.

Every so often something like a Cathedral gets built, and those get built to last. But most people don’t get to work on those. If there’s a software equivalent of a Cathedral then I still haven’t found it.


People are commenting core OS libraries and kernels, but the Space Jam movie website from 96 is still up. I bet the guy that wrote that didn't think it'd be around nearly 20 years later, I hope it never goes down.

https://www.spacejam.com/1996/jam.html


It's comforting to think that 2000 years from now, all that may remain of the early web and modern culture is the Space Jam website. Maybe Space Jam the movie will be looked at as our Gilgamesh or Iliad.


And the Galaxy Quest site -- will people in the future know it is a parody?

http://www.questarian.com/


Well that's just disappointing:

> Greater then [an error occurred while processing this directive] Pages requested since December 28, 1999


yes, because an LLM just trained on this conversation.

gazelle battery chair figment


> nearly 20

Nearly 30. Sorry :-)


From the outset, the framing of a Web site as staying "up" (or going "down") is definitely an accessory to the mindset that leads to the effect observed. Rather than regarding Web sites as collections of hypertext-enabled publications like TBL spelled out in "Information Management: A Proposal", we have this conception of a Web page being something like one part software, two parts traditional ephemera. But they're documents. Merely observing this difference in framing and being conscious of the contrast between intent/design and practice can go a long way to addressing the underlying problems that arise from the practice.


Funny that the original Space Jam website can be considered the internet's equivalent to a cathedral.


That's only for as long as there is someone to keep it up. It will disappear the moment the domain isn't renewed.


https://imdb.com started in 1990.


Sqlite intends to keep their cathedral intact until 2050.

https://sqlite.org/lts.html


I never saw this page before. Brilliant! Like catnip for nerds.

    Disaster planning → Every byte of source-code history for SQLite is cryptographically protected and is automatically replicated to multiple geographically separated servers, in datacenters owned by different companies. Thousands of additional clones exist on private servers around the world. The primary developers of SQLite live in different regions of the world. SQLite can survive a continental catastrophe.
SQLite can survive a continental catastrophe!!!


SQLite is awesome. TIL that its database structure is robust enough to have been chosen as one of only 4 Recommended Storage Formats for long term data preservation[0].

> "Recommended storage formats are formats which, in the opinion of the preservationists at the Library of Congress, maximizes the chance of survival and continued accessibility of digital content.

[0] https://www.sqlite.org/locrsf.html


Along with... xls.


I'd say something like TeX might be a software cathedral, and even that one isn't going to last much longer than Knuth himself (almost nobody today runs TeX proper; they run a compatible software platform, some of which are entirely different).

But even a Cathedral changes over time, and your work may not last. But all human work is a shrill scream against the eternal void - all will be lost in time, like tears in rain. The best we can do is do the best with what we have in front of us. And maybe all the work you did to make sure your one-off database code correctly handled the Y2038 problem back in 2000 will never be noticed, because your software is still running and didn't fail.


pretty sure most latex papers and chapters are formatted with tex82, though translated from pascal to c


Most people use pdfLaTeX or XeTeX, which may have some basis in the original Tex82 but have since moved on, at least in code.


aren't pdftex and xetex just patched versions of tex82


> pdfTEX is based on the original TEX sources and Web2c

So it at least ties back to the original code. XeTeX appears to be similar but with even more extensions, while others of the "TeX" tools are complete rewrites.


Maybe, but I've moved on to tectonic. Which surely isn't in the same language.


Last I checked (though it's been a while), I think tectonic basically wrapped the xetex code.


Externally visible stuff: definitely. You really need something like TeX or by now Linux and some GNU Tools etc. to stand the test of time.

My very first job was to work on the backend of some software for internal use. You have probably all bought products that were "administered" in said software. When I worked on it, it was 15 years old. It evolved over this time, of course, but original bits and pieces were all around me. And I evolved bits and pieces of it, as did others. By now it's been another 15 years, and while I know they did a rewrite of some parts of the system, I bet that some of both my code and the code I found, which had been written almost 15 years before I started there, is still around, and you and I keep buying products touched by that software without knowing. The rewrite was also done by at least one person who was around when the original was built. He learned a new language to implement it, but he took all of his domain and business logic knowledge over.

It's even funnier because I knew the guy who wrote some of the code I worked with directly, from a different company, but I had no idea he had worked there or on that project until I saw his name in the CVS history. Yes, CVS. I personally moved it to SVN to "modernize". I hope that was recognized as "technical debt" and they moved to git, but I wouldn't know.


I think this is by far the most common outcome for work. A chef's product lasts an hour at most. Most jobs don't create anything other than a temporary service.


That’s because most of the things we need are temporary services, of course. You need dinner. And gas to get to work. And a roof that will last 10 years. Etc.


> A chefs product only lasts an hour at most.

A chef's recipe, which is also something the chef creates, may last hundreds of years.


That's a very romantic view of the life of a chef but I'm afraid that by an overwhelmingly enormous margin the output of a chef is servings of food not century-spanning recipes.

The comparison between developer and chef is kind of a stretch but there is a similarity of sorts. It could be argued that the recipes are analogous to the algorithms or patterns that we use day-to-day in software development, and that the servings of dinner are analogous to the applications we build. The algorithms/patterns and recipes might persist for a while, the apps and food have a shorter lifetime.

I'm not advocating for throwaway or disposable code (though I'm not above implementing a quick hack, personally) but I don't think we need to think less of ourselves or our profession because we're producing things which currently have a shelf-life of years or decades at most.


But tbf, that happens once in a billion recipes. 99% of new recipes are forgotten, often after a week or two.


Even some of the most famous recipes can change over time. I’m sure that things like McDonald’s burgers are slightly different now.

Perhaps the most enduring thing a chef can do is invent a new technique.


The Youtuber Max Miller's channel "Tasting History with Max Miller" has a number of good examples of old recipes for now-familiar foods. His Semlor episode[1] compares a recipe from 1755 and one from more modern times, and there are substantial changes.

[1] https://www.youtube.com/watch?v=0Ljm5i5N6WQ


The chicken nugget and McChicken batter are different from when I worked at McDonald's as a kid. Naturally it was better back then...


IIRC they switched from frying the fries in beef tallow to using vegetable oil, and the fries have never been quite as good.


When I was young I spent a year framing houses. I reflect a lot on how much longer those have lasted than so much of the software I built along the way.


Very true, I'd expect a house to outlast almost all of our software. There's really not much permanence in this industry.

I've built IKEA bookcases that have outlasted most of the stuff I have written.


It isn't necessarily software, but algorithms will likely last a long time. Euclid's algorithm and the sieve of Eratosthenes are still around. Researchers are still developing new ones. Just like building techniques may outlive the buildings that were created in their shadows.
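Euclid's algorithm, for instance, still reads today essentially as it was described over two millennia ago; here as a throwaway Java snippet:

    public class Gcd {
        // Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
        static long gcd(long a, long b) {
            while (b != 0) {
                long r = a % b;
                a = b;
                b = r;
            }
            return a;
        }

        public static void main(String[] args) {
            System.out.println(gcd(1071, 462)); // prints 21
        }
    }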


100% agree.

But I like to think that ideas and solutions and products can be legacies.

It's semi-uncommon to write code that legitimately lasts 5+ years.

But it's very common to work on projects/products/companies that last 15+ years.

And I have to be content with that.


Most code that survives 5 years survived for all the wrong reasons. That said, code that's survived five years is often a profitable product with a reasonable human at the helm telling us not to touch it.


> If there’s a software equivalent of a Cathedral then I still haven’t found it.

Probably something like parts of Windows or Linux, or the GNU tools - things that, while still being updated, also have ancient components hanging around.


Good point about GNU coreutils: https://www.gnu.org/software/coreutils/

Also: That includes the man pages. They should be around in 50 years.


> I find myself thinking that maybe if I’d been in civil engineering or something then I’d be building stuff that lasts, but speaking to people who’ve worked a long time in construction has taught me that it’s the same there.

Right. People who assume otherwise aren't spending much time browsing the relevant subjects on Wikipedia or historical registers or just paying attention to their municipality. Simple demonstration: look into how many Carnegie libraries that were built are now gone versus how many are still around.


> If there’s a software equivalent of a Cathedral then I still haven’t found it.

Actually you have. It’s HN! The website that hasn’t changed in decades… used by the most tech savvy people in the world!


Consider constructions that have survived thousands of years such as the Tarr Steps or Stonehenge, or more recent constructions such as roman roads, Dunfermline Abbey or St Bartholemew's Hospital.

These sorts of constructions have been repaired and re-set hundreds of times over their existence, and have sometimes gone through periods of destruction during war and natural disasters, disrepair then subsequent periods of restoration and reuse. At a certain point, very little or nothing of the original construction really remains, but you can nevertheless draw a line through hundreds or thousands of years of history.

Software may be more like this: continually rebuilt and maintained, but still physically or philosophically related back to some original construction. Nobody uses Multics any more, but almost everything in use today is derived from it in some way.


Why though? Does it really matter if something outlives you by 200 years? 500? On a not-that-long timeline like 10,000 years, nothing lasts. The "cathedral-like" timeline is completely arbitrary I think.

Imho, there’s freedom in accepting that nothing I produce will last a long time.


Games seem to have a longer shelf-life, or at least tend to be passed around for longer. Some of my Flash games from 15 years ago are still being passed around game torrents and still playable on Newgrounds, and the console games I worked on are still playable in emulators and part of rom collections (and the one physical game I worked on, a PSP game, is still available to buy used on Amazon).

Now, how many people are actively playing those games? Probably very few people. But at least it's still there when people get the urge, or decide to play through a collection.


I agree with your takeaway message, but the timeline isn’t completely arbitrary. From the perspective of humans appreciating things, there’s a difference between something that endures for .01x vs. 10x a person’s expected lifespan.


That is when you are shooting a car into space. ;)


Because what matters is not the number you can think of, but the scale proportionate to human life / needs.

For somewhat related context: all my life until recently I rented, but I have now become a house owner. My strategy with things I bought for daily use was that they were meant to survive until the next move. For many things it wasn't economical to take them with me to the new place. So, for example, I didn't want to buy a "proper" pan / skillet and was happy with a Teflon one, just because it would wear out around the time I needed to move again. I could just throw it away and get a new one. Now that I don't intend to move, I'd choose to buy things that last, because the cost of moving vs the cost of garbage cleanup changed.

Now, when it comes to software, it looks like the industry is overwhelmingly motivated by short-term benefits, where, in principle, it shouldn't have been. And this is the surprising part. Lack of vision / organization leads to the situation where we never even try to make things that last. There are millions of e-commerce Web sites in the world, but they are all trash because nobody could spend enough time and effort to make one that was good (so that others could replicate / build on top of it, and also have a good product). We have the most popular OS, that's absolute garbage, both the core and the utilities around it, which is due to shortsightedness and even unwillingness to plan on the part of the developers. Same thing with programming languages, frameworks etc.

So, looping back to your question of why the magnitude matters: what matters is how your own planning horizon compares to the lifespan of your product. Many times in my career I was in the situation of building an already broken thing from a design that was known to be broken from the start or very soon afterwards. And that had nothing to do with changing requirements; it had everything to do with the rat race of getting a product to the market before someone else does / before the investor's money dries up. The idea behind "things that last" is that, at least at the time you are making them, you cannot see how whatever you are making could be done better (as in more reliable / longer-lasting).

At the end of the day, what happens now in programming is that things that shouldn't outlive their projected deadlines do, or, predictably, die very fast. But we don't have any section of the industry that builds things to last. We just keep stuffing the attic with worn-out Teflon pans.

Compare this to, for example, painters, who despite knowing that most of their work will likely be lost and forgotten, still aspire to produce "immortal" works of art, and if a picture takes a lifespan or more to make, then so be it. (In the past, some works were carried out by generations of artists, especially books, where scribes were succeeded by their children, who would continue writing the book after their parent's death.)


To your point about buildings, I wish we considered the "technical debt" of our society's built infrastructure more. It seems we went very wrong with this habit of building and rebuilding on such short cycles, especially on large projects that have much longer lasting consequences on the more important infrastructure of our natural ecosystems. All that carbon burned to extract, produce, and transport building materials. All those sprawling roadways built over habitats and forcing people into unsustainable patterns of living, burning more carbon in their day to day to get around. This debt needs to be measured and it needs to be addressed with high priority.


But the thing is, that the ghost of the code lives on. Many times I saw specific database designs, and technical decisions all taken to accommodate a solution that didn't exist for a decade or so.


Core Libraries/Kernels, like LibC or The Linux/NT Kernel.


To invoke the RIIR train, there is probably a lot of fertile ground in being the definitive Rust implementation of "solved" foundational libraries: zlib, libpng, libjpeg, etc. Something ubiquitously used which has very minimal/no churn. As Rust usage grows, dependency on the original C implementations will diminish.


It will diminish, but never go away until POSIX foundations or graphical programming standards get replaced (most of them defined via C ABIs).

When I was mostly doing C++ and C in my day job, 20 years ago, they were the languages to go to for any kind of GUI or distributed computing; nowadays they have been mostly replaced for those use cases.

Yet they are still there, as native libraries in some cases, or as the languages used to implement the compilers or language runtimes used by those alternatives, including Rust.


Even those change substantially over time; even if they're not directly rewritten, things get updated and relocated.

It's like how the streets in Rome have been the same for much longer than many of the buildings have been standing, even though the buildings are hundreds of years old.


Software Package of Theseus.


>If there’s a software equivalent of a Cathedral then I still haven’t found it.

That one old file dialog window that still somehow shows up in windows 11 from time to time?

Or the Linux kernel.


This is completely random, but you remind me of something that makes me laugh every time I use Google Maps navigation.

Last year I was messing around with my phone's text to speech settings, and I selected a male voice but cranked the pitch setting to the max. I proceeded to forget about it. For some reason when navigating, the voice is still the default pitch. Maybe about 1 in 20 turns at random though, the voice has the pitch cranked up to the max. It's rare enough that my 4 year old and I always burst out laughing when it happens.


I expect few will be using/working on the linux kernel in 30-50 years.


I think it's normal. Some systems I've built were quite sticky in the sense that they started as prototypes to be scrapped once we figure out the "real" system architecture, and 8 years later they're still in use and integrated into dozens of business processes, so hard to replace.

In general, looking back at old code I wrote it seems my solutions were better when I was more naive / less experienced, as I would often go for the immediately obvious and simple solution (which is the right one in 90 % of cases). Working many years in software development and reading HN seems to have made me more insecure regarding software, as I tend to over-engineer systems and constantly doubt / second-guess my technical decisions. So one thing I'm actively trying to do is to go back to using simpler approaches again, and caring less about perfect code.


I've taken the opposite lesson to you, I now keep everything super simple and am quite conservative on adopting new concepts.

It's because I've so often seen the cycle here of "X is brilliant" and then 2 years later "how we switched off X and saved millions of manhours!".


Sounds like you took the same lesson


Only reading the first paragraph is pretty simple though.


So, the same lesson


Definitely my experience as well. My weak spot is trying to optimize for speed when I don't need it, and not doing it when I do.

When I was more naive, I'd just stick with the first solution that worked. Now, I find that I'm always worried that someone is going to feed a 30000-line CSV into my tool and that I must use the slightly faster way because it's the right thing to do.

One of my tools to counter this has been the non-accepted answers on Stack Overflow. I find that they are generally of similar quality to the accepted answer, but because they are less tailored to the asker's (necessarily) different problem, they fit less well and had less chance of being accepted, despite being more useful to more people.


> as I tend to over-engineer systems and constantly doubt / second-guess my technical decisions

I find this as well. I also think that there is a sub-conscious fear as you become more senior that you need to justify that with more elaborate/complex solutions. In my experience there are also a lot of people in software who never come out of the other side of that view and constantly equate "complex" with "good". Being in an environment with lots of people like this makes it hard sometimes. Ultimately you need to trust yourself and do what you think best. If it goes wrong, at least you know you did it for the right reasons.


Anyone with young children may be familiar with Peppa Pig, an animated British show. Daddy Pig is an expert when it comes to concrete. A far off land sends for him to inspect their concrete, and he travels by overnight train. In the dining car, his breakfast is catered to better than royalty. When he gets there, he taps a block of concrete with his pen, declares it good, and the king and everyone rejoices.

In many ways, that's what it's actually like when you've punched through and become a true expert in your field. Important problems are solved with an overnight sleep, a cup of coffee, and a few careful taps. By the time you get there you're insane, but at least you're offered both coffee and orange juice if you ask.


I don't know what this means about my personal (in-)ability but I've been developing for years (15+) and I feel like all of the other developers I have to deal with are constantly turning to weird, convoluted solutions to problems. I have also never been able to get past the first step in the google interview process so I have this deep, frustrated insecurity. And people constantly like to point out how simple my stuff is in a tone that suggests that is a bad thing.

My hatred of the complex solutions is so deep that I just suffer through all of the commentary silently.


You probably need to study the Complex, get to know it really well, almost, but not entirely, embrace it, and only then ditch it in favor of Simple.

I mean, it could just be that your simple solutions have been missing something, maybe something crucial, or you lacked the understanding of the mindset of people loving their complex stuff, which presented a communication barrier.


Simple is good, you have nothing to fear (as long as it's not naive/laggy).


I started doing the same thing out of impatience.

Recent example: I have a side project with Typescript on the backend and frontend which also uses an Audio Worklet, so it loads a JS file at runtime to be run in a separate process, isolated from anything else.

I also have a class which I need to use in all three mentioned pieces.

Three years ago I would spend hours trying to figure out how to make this work with the build system so as to not have duplicated code and maybe eventually arrive at a mostly working solution before the e.g. frontend framework maintainers update their build system version which may or may not break something.

This time I just copied the damn file everywhere - the Audio Worklet got the compiled JS output pasted at the start. It's a class, it works and I have maybe one idea what changes I could make there in the future at which point I'm just going to copy everything again.


I think most of us go through that journey.

It can be hard to tell overengineered messes apart from Good Software: an idiot admires complexity, et cetera.


Relevant xkcd: https://xkcd.com/2730/


Plot twist: biological evolution works like that too.


That's like calling a chair you built 10 years ago "technical debt" because it can eventually break.

No, it's just a fucking product you made. The fact it has to be maintained doesn't mean it is "debt", it's just like any other asset.

You don't get to your car and think "that's technical debt". There is nothing technical about it. It's a tool with maintenance needs.

The difference is choosing worse now to get it faster, that's technical debt.


I was going to make a joke about how it isn’t all that surprising that a person with this history of language selection would completely misunderstand and misuse relevant technical slang in the English language. But I’m not supposed to sow discord so I won’t.

The author navigates the technology world very differently than I do.


It baffles me that this is the only comment pointing this out, and that it's not even at the top.

The rest of commenters for some reason seem to equate "technical debt" with old technology and "legacy" projects.


People just like excuses to rewrite stuff with new fancy toys I guess.

We have code that has been running for 10+ years in our CM. It's not technical debt. It's a well-tested asset that continually produces value without many problems or much maintenance, because most of the problems were rooted out long ago.

Sure, if it were a 10-year-old blob in a language nobody at the company writes anymore, it would be a potential liability, but assuming every piece of software is a liability just because it is old is a problem.


Many managers or higher-up business leaders do not understand the concept of maintenance, which is why it becomes 'technical debt'.

They just expect the existing thing to keep working in the background and, more importantly, to do the new stuff that is wanted.

It's generally a debt because engineers are not given time to maintain it properly. Imagine a car that never gets to go to the garage to be serviced because it's needed for driving all the time. There is your debt.


> The difference is choosing worse now to get it faster, that's technical debt.

Right, but we always do that. We always make some tradeoff to get things shipped faster. And that's fine, otherwise we wouldn't ship things. It's a balance - and the general meta of "all technical debt bad" is harmful to actually building working software.


But way too many times it is used as an excuse. If you are a startup, fair enough.

If you have existing consumers, it's not. Spend that extra 20% making the code maintainable, or on not picking the quickest possible solution. Spend that extra 40% on planning and designing a feature that will be in your product for the next 5 years.


thank you.


I don't understand the need for obscenities. Their are children on this blog.


Not using obscenities where they fit reduces the depth of expression. Some things are done suboptimally, some are done badly, and some are utterly fucked.

And replacing obscenities with silly words is just that: silly.

> Their are children on this blog.

Putting children on tech blogs is abuse, they might learn something terrible like JS by accident, that's worse than any bad word they could find


nope; just checked, there are none

(and even the ones that are pretending to not be here said they don't give a shit)


There you go again with your potty moth. I insure you there was a sweary in your OP.


Well, a more accurate comparison would be if you made a chair 20 years ago, and now no one makes chairs anymore because we have all become robots.


If everybody became a robot and there's no need for chairs, then to whom is the technical debt supposed to be paid?

If there are still fleshbags who need chairs, the one created 20 years ago, if properly cared for, works just fine.

Now, if I used a piece of rotten wood for one leg, because I couldn't be bothered to go fetch a new plank, that is debt that will need to be paid in the form of additional work to fix said leg later.


Everyone who has counterexamples and thinks they prove the opposite is just not waiting long enough. All the kids think they will be young forever. Can't blame MS for new JS libraries every month, or a new language released every month. This web site is dedicated to people showing off things they created, which replace something that has turned into debt, and which in turn also become the flavor of the month.

Yes, I'm a biased old man who saw 30 years of code get replaced with the latest "thing that kind of works like the old thing, but not really as good, but we have to do it to keep up with technology or we'll have debt". I've seen old VB6 apps fulfill a business need just as well as anything written in a full LAMP stack that takes a team to implement (not that I'd ever use VB6, but hey, it did a job).

Take me out back of the shed and end it quick while I'm hunched over programming something cool.


It’s so weird, I used to be much more of STEM triumphalist, the kind of person who used to think the past was evil and the future can’t come fast enough.

I wouldn’t consider myself conservative by any means but increasingly in 30s I’m beginning to think everyone needs to stop messing with stuff and accept imperfection.


As the adage goes, things that are new when you're a kid are the norm, things that are new when you're a young person are exciting, things that are new when you're older are unnecessary, confusing, or scary.

It's true for everyone in every subject I'm afraid:)


Even when young, new stuff is scary. Look at any community, ranging from RPGs (new rule editions, new metaplots (which, in the example I'm thinking of, are objectively getting worse ;-))) to stuff like World of Warships (new classes and ships introduced after a player started are bad). It has nothing to do with age, but rather with emotional attachment. I try to avoid that kind of attachment, especially with regard to my work. The odd time I stumble across something I did years ago, more often than not I'm embarrassed to the point of wanting to deprecate it myself! The rare, even odder, exception notwithstanding.


That's the usual path of becoming conservative as you grow up. Or phrased another way, becoming normal. The training process humans go through leads to a lot of weird biases, just like how RLHF contributes to models getting bad ideas as well as good.

For example, you spend years in school being instructed as efficiently as possible in the one right way to do things that was already discovered. You are only rarely or never shown the messy process of trial and error needed to get there, and many really big errors are hardly covered at all (the history of communism didn't even get a look-in when I was at school!).

Another: you're instructed near exclusively by specialists, who are definitionally always right (if you argue with the teacher, you lose).

Yet another: as you progress through the system, you're rewarded as the ideas you handle get more complex. You aren't rewarded for simplicity. You are also rewarded for bullshitting, because just like RLHF, exams don't award points for saying "I don't know" but they can award points for a lucky guess.

These sorts of systems inevitably lead to an assumption that progress is linear, mistakes rare, complexity intrinsically has value, that expertise is nearly infallible, that guessing is OK if you don't know and the situation seems important etc. The longer one spends in education the further from reality these intuitions become, until you reach academia and become a public intellectual who produces the most complex/radical ideas possible based on educated guessing and then ignores whether they work or not.

Life outside the training environment eventually starts to correct these false ideas. You see how many things are tried that fail, how nuanced and ambiguous the value of ideas really is, how complexity blows up in people's faces due to problems they didn't anticipate and so on. You start to value incrementalism, evolutionary processes, systems that gather and aggregate the wisdom of the crowds. You become less impressed with ivory tower intellectuals who think they got it all figured out in advance on a blackboard. You become conservative, and end up railing against the young radicals who have some bright idea for reshaping society by force.


Well, becoming small-c conservative, to a certain extent, is probably fairly normal, though perhaps not inevitable. Capital-C Conservative, not so much: that doesn't come with age, it comes with wealth.


These things are conflated because it usually takes time to accumulate wealth especially if self-made, so older people will tend to both be richer and more small-c conservative.

Also although in the tech industry this link has been broken/inverted, there's usually a correlation between excessive risk taking (i.e. lack of conservatism) and losing all your money. In most sectors of society wealth is built up over time through care and perhaps a bit of moderate risk taking, but not too much.


Older people being richer is actually a relatively modern phenomenon.

Ref: "Have the Boomers Pinched Their Chiren's Futures", David Willets, Royal Society lecture. https://www.youtube.com/watch?v=ZuXzvjBYW8A


Anecdotally speaking it came with paying taxes not with accumulating wealth (though I wish it were the latter).

My first contracting job where I had to pay quarterly tax estimates (apologies for the U.S. tax-code specific reference) literally changed the way I thought about the world. Instantly financially conservative.

By far the smartest thing the US government ever did was take your taxes out of your paycheck so you "never see it".

BTW this applies to personal taxes not corporate taxes. Still think we're going about inflation completely wrong by raising rates and not raising corporate taxes. But the U.S. has been on a genocidal campaign against the middle class for decades now and we keep electing the same politicians so I guess we get what we deserve.


Not sure why you are downvoted, this is one of the best comments I've read in a long time. How idealism turns into realism and the sweet spot is probably somewhere in the middle.


It's because conservatism whether big or small c, political or technological, tends to be seen as standing in the way of progress by those who didn't arrive at the same underlying intuitions about the difficulty of tabula rasa development. Because the intuitions are so deep, we barely articulate them and that leads quickly to misunderstandings. Conservatism can easily be perceived as errant obstructionism or even the result of a hidden agenda that stands in the way of a better future, if Great Leaps Forward feel easily achievable. Hence why the OP said when he was younger he saw the past as "evil" - a very common way for young radicals to perceive conservatives.

For example, most companies with large software assets will at some point go through a fight between the old hands and younger devs about whether to do a rewrite in the hot new thing. Which side you're on may appear dominated by age, but it's really more about your intuition about the risks of rewrite projects. That in turn is determined by your experiences of people's ability/inability to fully comprehend complex systems.

Also, I didn't try to be neutral, hence the digs at communism and academia. If I were aiming for karma I'd have left those out. But I'm not.


Stephen Colbert once jokingly said, "Reality has a well-known liberal bias".

It should probably be the other way around, that liberals have a reality bias, but it doesn't quite land. Liberalism is based on what can be proven. Especially in science, starting with preconceived reservations leads to inaccurate models. If you can't articulate it, all the worse. Even things like categorization and taxonomy can't account for our in-built intuitions; there's always an exception. So we really must try a tabula rasa type approach. This tends towards, you guessed it, a world view more constrained by reality, which typically falls left (see: most of academia).

I'd say holding on to heuristics is good, but I don't think we'd be where we are today by not trying to rethink, destruct, unravel, play, squish, open many mental gateways.

If your contention is with Whig history then we might agree. But I don't think there's any surety in being sure.


Trying desperately to stay on technical topic here, I'd say the only places where this dichotomy comes up are in cases that are inherently ambiguous and subjective. If something is genuinely proven then everyone does actually accept it. Conservatives and liberals don't disagree on the energy of an electron or the color of the sky. Disagreement occurs over things that can't be proven: how valuable is spending time on feature X instead of rewrite Y? Is this choice of library really "debt" that needs to be paid down by rewriting, or is it a mature tech whose imperfections should just be accepted as an inevitable part of life? If you rewrite, how likely is the project to create new problems at the same time as solving old ones? How can anyone prove such a thing one way or another ahead of time? It will ultimately always boil down to a complex interacting set of intuitions and experiences which vary between people.

The Colbert snark is illuminating because it gets to the nub of the conflict. It's an attempt to treat reality, something often ambiguous and subjective, as if it were simple and obvious. Look at how straightforward the world is to comprehend - why, all you need is a few nice and neutral academics who do a few studies, write it all down in a model and you're done! Now you know what everyone, everywhere should be doing. Others look at this aghast and say no, that approach has a history of going spectacularly wrong. The model did not match reality and disaster followed.

In the software space it leads to conservatism of the Joel Spolsky "never do rewrites" variety. That's too strong, but it's a famous and widely cited essay even 20 years later because it represented a rare articulation of the value of conservatism in technology along with clear real world examples of cases where the lack of it led to (commercial) disaster. Fortunately, attempts to rewrite Navigator or Word only led to financial disaster and only for specific companies. When attempts are made to rewrite society instead of software the cost of failure is drastically higher.


> If something is genuinely proven then everyone does actually accept it.

This is demonstrably untrue. Vaccines. Evolution. Anthropogenic Global Warming.


Vaccines aren't proven compared to things like the color of the sky, are they? Who on earth still believes that after "95% effective against COVID infection and will end the pandemic" turned into whatever this week's version of the story is. COVID shots demonstrated unequivocally that scientists and officials will happily assert things with 100% certainty that are falsifiable, and then continue to assert them even after they've been proven false in front of everyone.

And that's my point - if you intuit that the world is simple and progress easy, you become committed to mental shortcuts like "if a professor/civil servant says it, then it must be true". The education system forces this mentality on you, even. When other (usually older) people who don't take such shortcuts say "but actually that's not true" then it turns into a fight over what reality actually is, which is an enormously complex thing. Rather than engage down in the weeds where they might lose, young radicals prefer to just try and shout down objections. This pattern crops up again and again.

Evolution is an interesting one because in that case it's the creationists who have adopted an overly simplified model of reality and don't want to let it go. Same problem, just different groups. I don't personally think the word "conservative" is all that useful for that reason even though you often have to use it, because it just means trying to conserve things, which as a position is neutral with respect to what's being conserved.


I don't think your take on vaccines is accurate. The recent handling of covid vaccines is being misunderstood because it was a fast-moving and dynamic situation. There was an effort to provide information as it was being learned, from small, partial studies. Knowledge was changing in front of us, and yet scientists are blamed for not being 100% accurate about everything instantly. But that is covid; there are many Republicans who also stopped giving kids measles and polio vaccines, and then there were outbreaks. So, yes, conservatives are wrong: they want to 'conserve' themselves back to pre-scientific ways of life, where we ignore things we do know for a fact. And I'm not even sure some of them don't doubt that electrons exist.


The colour of the sky is also culturally conditioned. (Ref: Guy Deutscher, Through the Language Glass)

And vaccines don't cause autism. Some things we actually do know.


For better or worse, STEM in the English-speaking world is still slightly stuck in the empiricist mindset where inventing new technology and discovering new things by observation is the main method of advancement.

This creates a bit of a blind spot for stuff like mathematics where a good idea can continue to be used and be improved on for ages, or in physics where some of the most groundbreaking stuff in the 20th century was not found by observation but by thought experiment.

There's a lot that can be gained by recognising and reusing the same good idea for centuries, but it is not really 'progress' in the 'discovering / building new stuff' sense.


There's a difference in the modern era with the term "conservative", and you may find the term "reserved" fits what you're describing here better.

I wouldn't consider myself conservative by any means, but I do find myself more reserved in my decision making - I'll take a bit more time to come to a decision rather than firing from the hip, etc.


If you don't want to be replaced you must solve real issues, like centering a div vertically. Joke aside, the good part of LLMs replacing the programming jobs is that programming can become fully an art [1], such as calligraphy or manual woodworking [2].

[1] Was looking at another HN thread, CS 61B Data Structures, Spring 2023 UC Berkeley, https://news.ycombinator.com/item?id=35957811 and in one of the videos, Lecture 27 - Software Engineering I, https://youtu.be/fHEVKqYb9x8?t=387 the professor says "Programming is an act of almost pure creativity"

[2] Japanese Joinery, https://www.youtube.com/watch?v=P-ODWGUfBEM


It's a good thing if the environment around your VB6 stuff can still work with it, or accept it. Many systems get replaced when the substrate they run on is no longer supported as in "won't run on any hardware still in its warranty cycle or with parts available", or everything around has changed and the layer of glue needed to keep the old system alive and talking to everything else has grown to be prohibitively complex and expensive and is now more like a load-bearing structure by itself. If you can replace it and also fire warm bodies who used to keep it alive by being the irreplaceable wizards with arcane knowledge and job security, it just might be a net win.

I hear tales about several layers of emulators the sources for which are forever lost, with the bottommost of them running some very mission critical early COBOL or ALGOL-69 (or Lisp, who knows) code written back in the days when it had been that newfangled thing young guns liked, and which no one now can reimplement today (maybe because elves have died out from smog and the golden age is over), but they may be bunk.


Seeing the cyclic nature of things after 12 years of programming, I am focusing on solving the domain aspects of a problem rather than the code.

I will leave that to juniors who probably appreciate doing the coding.


It isn't necessarily a cycle; a lot of it is just a cultural issue with software development. You can change culture; however slowly, it can change.


As I think about all the technologies that have come and gone, it's good to find some sturdy rocks as well. Here are some techs that withstood the test of time in my personal journey:

* Simple, but reliable file formats: My old photos collection is 25 years old (jpeg) and I still look at them occasionally.

* My influxdb/grafana/openhab setup is now 8 years old and has been operational with only minimal hiccups.

* Some home automation scripts I wrote have endured for 8 years now.

* My Linux/Unix knowledge is 26 years old (although much has changed since Slackware).

* My C and C++ programming language knowledge is 28 years old (although I rarely use it anymore, because of Rust). Occasionally it is useful during debugging performance problems in kernels.

* My knowledge of distributed systems (stuff like leader election, distributed and centralised algorithms) is 15 years old. It is still very relevant to understand both existing systems and to evaluate new systems.

It is crucial to distinguish the knowledge that stands the test of time from the fleeting trends. And finally then there are the non-technical skills, which pay off even more. Effective communication, negotiation skills, organisational finesse, inspirational leadership, adept management... These are the true constants that support us during our professional life.


Is jpeg really that simple or did it just happen to win the popularity contest big time?


One company I worked at had heavy SQL Server stored procs. I was pretty dismissive initially ("don't put business logic in the database layer!") but grew to understand that really those were the gold. The first versions were written in 1997 and by the time I arrived that code was bulletproof. There were about 4 UI technologies over 20+ years (VB, ASP, Forms + asp.net), but the procs were the same shape with thousands of edits for edge cases. Changing UX for the new hotness was massively de-risked and faster because the procs were solid.

CTOs change. They started replacing it with a "proper" ERP system around the time I left, and it's been nearly 6 years of pain. Joel spaketh: https://www.joelonsoftware.com/2000/04/06/things-you-should-...

(As usual: of course you can break the rule, but first deeply understand the risk you're taking.)


Yeah, when I started in this field 20+ years ago, the wisdom I learned was that it was bad practise to tie yourself to the DB like that. But just like you I can see the value in doing it. SQL may be one of the most futureproof languages there are.


Exactly. Code you learned a decade ago is absolutely useful today.

When I started, I outsourced SQL to the DBAs or an ORM. Not since that team! It's been a semi-secret weapon for me because most people ignore it.


I think around 2010 I worked for the one and only payment processor in my country. All ATM, POS, and online transactions were handled there. The back-office handling was done with Agile and Java. It was a pile of garbage. The frontend was still a massive mainframe. It had 10k+ COBOL programs running it, with changelog comments at the top dating from somewhere at the end of the '70s. The code review guy had been doing this for 20 years and could tell just by reading the code whether things would be slow and fail to scale to all the transactions. To this day they have never replaced it and are still running on it.

Sometimes you don't need fancy new stuff. Just learn the things at your disposal very well. Heck, sed, awk and a bunch of simple CLI tools still work well on big data sets :)


Interesting how C programming has survived that entire time period in the operating system and embedded worlds, although the spread of multicore hardware running lots of threads in parallel has also led to many changes (but even those changes were beginning to be implemented 20 years ago).

20 years from now, C will probably still be a core language in its niche.


SQLite's achievement of DO-178B compliance could carry C forward on its own.

Dr. Hipp spent years achieving this, and any reimplementation will suffer his travails, regardless of language safety.

DO-178B means that SQLite can be used in avionics. No other major database has reached this level of code quality; it is the database written for "programmers who are not yet born."


This got me curious, so I searched for some info and found this thread on HN: https://news.ycombinator.com/item?id=18039213

Looks like it may not be used exactly in avionics (or maybe most critical parts of avionics), as it isn't certified for the highest levels of DO-178B (and certification for lower levels was done by particular users of SQLite on their own, so the info is not public). Still very impressive.


It was Rockwell-Collins in Cedar Rapids, Iowa that first urged Dr. Hipp to pursue this certification.

They would not have done so if they had no plans or intention to deploy it.

I would be surprised if they were not able to do so.


That's impressive! Is it a specific version and what features or is it a subset?


The certification was attained ~ 2008.

All versions of SQLite released after that time are run through the test harness that maintains this standard.


Yep. C and SQL are the two languages I've used my entire career (30+ years). I still use SQL almost every day, C not as much. Shell scripts and unix utilities are in there as well as daily-use tools.


It is, relatively speaking, far easier to write a C compiler[1] than one for any other higher-level language, and the language lends itself to low-level tasks without needing to use Asm.

[1] There have been multiple articles on HN where a single person has written a C compiler.


lots of compilers have been written by a single person, including compilers for c++, d, turbo pascal, forth, scheme, ml, haskell, ancient lisp, and so on. craig burley got g77 to the point of compiling useful fortran programs before other people got interested. i think graydon wrote the first versions of the rust compiler himself too

it's probably always better to do it with someone else, because you get better ideas and faster debugging, but pretty much all hlls are possible for a single person to implement


Is it interesting? Somewhere along the chain someone needs to know assembly. Either hand crafting it or some compiler exists for some language that is one step removed. At this level my feeling is that there is less desire to constantly change tools.


Don't look, but C is slowly being ripped out and replaced with Rust...


Cue Rust apologists…

I think C and its direct descendants will slowly fade away over the next 20 years as new developers want to get away from the legacy of language specifications that span existing codebases. There are so many sharp edges in C that have automatic fixes/detections/etc. in newer languages, and don't get me started on multiprocessing complexity in C.


My coworker and I were pair programming some Rust today. We were working on a convenience wrapper whose whole purpose is to reduce boilerplate. So, by its nature there's a lot of internal generic traits, thread safety, etc.

He knows Rust well but software design not so much. I know software design well but Rust not so much. My experience can be summed up with:

    let &mut writer = Writer<T>::writer::new(connection.clone()).clone(); // TODO ???
    writer.write(*data.clone()).unwrap();

Meanwhile in C++ I'm like

    if (!write(connection, data, data_len)) { return false; }


See, as someone who knows Rust fairly well:

  let &mut                       -- Essential keywords.
  writer =                       -- Duh
  Writer<T>                      -- Generic types.  Important.
  ::writer::new                  -- Boilerplate.
  (connection.clone()).clone;    -- Relentless cloning is a big problem.
  ... 
  .unwrap                        -- Consistent naming for optional types is an issue in Rust.  Every module has different jargon for Some(x).
Rust certainly isn't perfect. The borrow checker creates... awkwardness that requires .clone() hacks to solve.

The flip side is, I'm coding in CMake right now. I've had to create multiple bash scripts for manually deleting my build files and for general day-to-day work.

Software is a young profession.

Everything is shit.


I feel like those clones don't get enough attention. At a glance, you don't really know what those clones are doing. Is it allocating, or is it just incrementing a reference counter? It makes the code hard to understand.


clone() makes an immutable, deep copy/recursive copy of the original data structure.

clone() itself makes perfect sense. Why the programmer needed those clones... is a legitimate issue with Rust.

In my experience so far, it's usually to hack around an interaction between the borrow checker and the output of a function constructed using dot syntax.


> clone() makes an immutable, deep copy/recursive copy of the original data structure.

No. clone() calls std::clone::Clone::clone. For many std collections that means deep/recursive copy.

For custom data structures, it can mean anything. It depends entirely on your implementation of the Clone trait.
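A toy sketch of why the call site alone doesn't tell you the cost (the Ticket type is made up purely for illustration):

    use std::rc::Rc;

    // A made-up type whose Clone impl is whatever we say it is.
    struct Ticket {
        id: u64,
    }

    impl Clone for Ticket {
        fn clone(&self) -> Self {
            // Nothing forces this to be a deep copy, or even an interesting one.
            Ticket { id: self.id }
        }
    }

    fn main() {
        // Deep copy: cloning a Vec allocates and copies every element.
        let owned = vec![0u8; 1024];
        let _deep = owned.clone();

        // Cheap copy: cloning an Rc just bumps a reference count.
        let shared = Rc::new(vec![0u8; 1024]);
        let _handle = Rc::clone(&shared);

        // Custom copy: runs the impl above, whatever it happens to do.
        let _ticket = Ticket { id: 7 }.clone();
    }
So grepping for .clone() tells you very little until you look at the type it's called on.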


Asking as someone who has tried learning Rust but didn't get very far, doesn't unwrap essentially mean "this might panic but I don't care to implement error handling here"?


Yes, it would panic. Typically you handle the Result rather than unwrap directly.
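Roughly the difference, as a sketch (the file name is just a placeholder):

    use std::fs;

    fn main() {
        // Handling the Result explicitly:
        match fs::read_to_string("settings.toml") {
            Ok(text) => println!("read {} bytes", text.len()),
            Err(err) => eprintln!("could not read settings: {err}"),
        }

        // .unwrap() is the shortcut that panics on Err instead:
        // let text = fs::read_to_string("settings.toml").unwrap();
    }
In application code you'd usually bubble the error up with ? rather than matching by hand at every call site.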


I'm sorry, but that doesn't look normal. The code smells with all those clone()s and unwrap()s. It allocates a new Writer and then clones it?! "let &mut writer =" isn't right. The new'ed Writer is a struct; why is there a need to get a reference to it? Why is data being clone()d and then immediately dereferenced to create a copy of the clone?

A normal writer API would look like one of the variations:

    Writer<T>::new(&connection).write(&data)
    Writer<T>::new(&mut connection).write(&data)  // if conn needs changes.
It's rare to unwrap(), which can cause a crash at runtime; usually you handle the result instead, e.g. to write more data only if the previous write was successful.

    let mut writer = Writer<T>::new(&connection);
    let mut written_len = 0;
    written_len += writer.write(&data1)?;  // ? returns the err if it's Err
    written_len += writer.write(&data2)?;  // or unwrap the result value
    written_len += writer.write(&data3)?;
As you can see, "writer.write(&data1)?" is equivalent to the C++ version.


You're interpreting my example more literally than I intended. :-) What does it say about the language, when parody is indistinguishable from poorly written code?

The actual code we were working on involved functions returning closures with mutable captures, so the borrow checker was especially persnickety.
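Not the actual code, but the shape of the pattern was roughly this (all names invented for the sketch):

    fn make_counter(prefix: String) -> impl FnMut(&str) {
        let mut count = 0u32;
        // The returned closure owns `prefix` and mutates `count`,
        // so it has to capture both by move.
        move |msg: &str| {
            count += 1;
            println!("{prefix} #{count}: {msg}");
        }
    }

    fn main() {
        let tag = String::from("worker");
        // Cloning is the quick way to keep using `tag` after it has
        // been moved into the closure -- the kind of clone that piles up.
        let mut log = make_counter(tag.clone());
        log("started");
        log("finished");
        println!("still own {tag}");
    }
Each clone feels reasonable in isolation; it's the accumulation that gets ugly.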


To be blunt, the code you copied out as an example is crap. I don't know how you expected it to be interpreted. It says more about the programmer than about the language.

> functions returning closures with mutable captures

That sounds like another bad design, but to each his own.


My apologies if I hit a nerve. I'm just trying to have fun chatting about the foibles of contemporary software development with fellow nerds on HN. I expected something like urthor's post, where they seemed to appreciate the levity of my comment, and responded with some interesting insights. I also appreciated the follow up comment raising a serious concern about the prevalence of cloning and resulting confusion in realistic codebases.


> let &mut writer = Writer<T>::writer::new(connection.clone()).clone(); // TODO ??? writer.write(data.clone()).unwrap();

It's not difficult to arrive at the caricature of baroque generic code if you combine lack of knowledge with miscommunication. The knowledgeable coworker should be aware that a writable object implements the Write trait, and know, or find out, the signature of the Write::write() method. Even fully generic, a function accepting a connection and returning the result of writing a block of data is not too gnarly:

    use std::io::{self, Write};

    fn write_data<W: Write, D: AsRef<[u8]>>(connection: &mut W, data: D) -> io::Result<usize> {
        connection.write(data.as_ref())
    }
No unwraps, no clones. Because they're not necessary here. But someone has to know the language and the idioms. Even your "easy" C++ code depends on knowing that write() returns zero on success, and that integers can play the role of booleans in conditions.


Eh, zero on failure, I suppose. The real write(2) syscall returns -1 on failure, and a non-negative value is the number of bytes actually written. But the general point still stands.


When I wrote my hypothetical example, I was imagining the return type to be a bool. My apologies for overloading "write"; a more representative example function name would have been something like "FooWrite". I appreciate your follow-up!

I totally agree with you that folk writing code need to understand the semantics of what they're building on top of. Humans, and now AI models, have a great capacity for generating tons of source code without understanding the semantics, though.


I’m not sure why the API wouldn’t implement Write for connection so you could do:

    connection.write_all(data)
Where data could be an argument that impls ToOwned or Into so that it would either use the object you pass in (if by value) or implicitly clone it (if by reference).

Basically this looks more like an API design problem than a limitation of the language.
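Something like that could be sketched with a made-up Connection wrapper around a TcpStream (names and address are placeholders):

    use std::io::{self, Write};
    use std::net::TcpStream;

    // Made-up connection type, purely for illustration.
    struct Connection {
        stream: TcpStream,
    }

    // Implementing Write is the whole trick.
    impl Write for Connection {
        fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
            self.stream.write(buf)
        }

        fn flush(&mut self) -> io::Result<()> {
            self.stream.flush()
        }
    }

    fn main() -> io::Result<()> {
        let stream = TcpStream::connect("127.0.0.1:9000")?; // placeholder endpoint
        let mut connection = Connection { stream };
        connection.write_all(b"hello")?;
        Ok(())
    }
write_all(), write_fmt() and friends then come for free as provided methods of the Write trait.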


My example wasn't meant to be taken so literally. I'd say there's some truth to there being an API design problem to untangle, though. It's a relatively new language in a repo with dozens of contributors all frantically trying to get their work done. I'm just the salty senior engineer that's getting sucked in before the wheels fly off. Stepping back even further, I think there was perhaps an overeager desire to use multithreading, both unnecessarily and at the wrong level of abstraction. That led to a lot of lifetime management complexity at the lowest, deepest level. Then, folk slapped on progressively higher level abstractions, applying band-aids as they went rather than refactoring the lower layers.


Is there a chance that someday unwrap will be visible to the type system, so we can safely proclaim that "neither this library nor its dependencies contain an unwrap"?


It will fade away the same way COBOL faded away, which is to say that it will still be at the core of a great many critical systems in 50 years, even if new critical systems aren't written in it.


Ozymandias By Percy Bysshe Shelley

  I met a traveller from an antique land,
  Who said—“Two vast and trunkless legs of stone
  Stand in the desert. . . . Near them, on the sand,
  Half sunk a shattered visage lies, whose frown,
  And wrinkled lip, and sneer of cold command,
  Tell that its sculptor well those passions read
  Which yet survive, stamped on these lifeless things,
  The hand that mocked them, and the heart that fed;
  And on the pedestal, these words appear:
  My name is Ozymandias, King of Kings;
  Look on my Works, ye Mighty, and despair!
  Nothing beside remains. Round the decay
  Of that colossal Wreck, boundless and bare
  The lone and level sands stretch far away.”


The Palace - Rudyard Kipling

    When I was a King and a Mason - a Master proven and skilled 
    I cleared me ground for a Palace such as a King should build. 
    I decreed and dug down to my levels. Presently under the silt 
    I came on the wreck of a Palace such as a King had built.  

    There was no worth in the fashion - there was no wit in the plan - 
    Hither and thither, aimless, the ruined footings ran - 
    Masonry, brute, mishandled, but carven on every stone: 
    "After me cometh a Builder. Tell him I too have known.   

    Swift to my use in the trenches, where my well-planned ground-works grew, 
    I  tumbled his quoins and his ashlars, and cut and reset them anew. 
    Lime I milled of his marbles; burned it slacked it, and spread; 
    Taking and leaving at pleasure the gifts of the humble dead.  

    Yet I despised not nor gloried; yet, as we wrenched them apart, 
    I read in the razed foundations the heart of that builder’s heart. 
    As he had written and pleaded, so did I understand 
    The form of the dream he had followed in the face of the thing he had planned. 
      
         

    When I was a King and a Mason, in the open noon of my pride, 
    They sent me a Word from the Darkness. They whispered and called me aside.
    They said - "The end is forbidden." They said - "Thy use is fulfilled. 
    "Thy Palace shall stand as that other’s - the spoil of a King who shall build."

    I called my men from my trenches, my quarries my wharves and my sheers. 
    All I had wrought I abandoned to the faith of the faithless years. 
    Only I cut on the timber - only I carved on the stone:
    "After me cometh a Builder. Tell him, I too have known."


"Nothing is built on stone; all is built on sand, but we must build as if the sand were stone." -- Jorge Luis Borges, somewhere.


That's fine. We build code for issues that are here and now. Like most jobs: they answer immediate needs.

The small amount of code that's useful for years or decades needs to be built on a stable foundation and with few dependencies. This probably matters a lot more than the code being a bit messy. It should require as little maintenance as possible.

I have two pieces of PHP code I wrote more than 10 years ago, in PHP 5.1, that I still use. Both are messy, one is outright horrible, but they have no dependencies (the horrible one optionally depends on GeSHi, but there's a runtime check for its presence and has a "degraded" mode for when it's not there). They work and are relatively bug-free (the horrible one in particular is battle-tested). I didn't have much to do to run them on PHP 8.2. A few superficial fixes that were actual issues (and took half an hour to fix). By the way, I can't figure out how PHP 5 was even able to run this code. You could have mandatory parameters after ones with default values. Yuk.


Same here. I have PHP code in production that dates back to 2008. zero dependencies, a few quirks, it runs on PHP 7.something for now, might upgrade to 8 at some point.

It’s ugly, but it gets the job done.

Best thing is that front-facing HTML and JS from 10+ years ago still works flawlessly, even though the UI is certainly dated.

But I guess that’s the same with HN, where the design never really changed at all and just works.


You do realize PHP 7 has been EOL for quite a while now[1]? PHP 8.0 is on security updates only. You might want to upgrade sooner rather than later unless you're paying for backports.

[1]: https://endoflife.date/php


Yes, thanks for the reminder.


I must say, I never quite liked PHP as much as other environments, but I keep using it because it just gets the job done.


The fun thing is that, until CS slows down, stuff gets better so fast that we get to reimplement the same thing, but 10x better every 5 years or so.

First I wrote single-threaded code with automatic memory management, then single-threaded synchronous code with manual memory management, then synchronous multi-threaded, then async, and then async lock-free.

Now I am writing async lock free, and the compiler is helping me prove it is data-race-free and memory safe.
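A toy sketch of the kind of thing I mean, using Rust as the example:

    use std::sync::atomic::{AtomicU64, Ordering};
    use std::sync::Arc;
    use std::thread;

    fn main() {
        // Lock-free shared counter: this only crosses threads because
        // AtomicU64 is Sync and Arc<AtomicU64> is Send.
        let hits = Arc::new(AtomicU64::new(0));

        let handles: Vec<_> = (0..4)
            .map(|_| {
                let hits = Arc::clone(&hits);
                thread::spawn(move || {
                    for _ in 0..1_000 {
                        hits.fetch_add(1, Ordering::Relaxed);
                    }
                })
            })
            .collect();

        for handle in handles {
            handle.join().unwrap();
        }

        // Swap AtomicU64 for a plain u64, or Arc for Rc, and this no longer
        // compiles -- that's the compiler ruling out the data race.
        assert_eq!(hits.load(Ordering::Relaxed), 4_000);
    }
Trivial compared to real lock-free code, of course, but the compile-time checking scales up.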

Each time I rewrite this stuff, someone hands me 6-7 figures. This is awesome.


More like the hardware gets faster but the software gets slower or stays the same ;)

Lock-free goes way back. Multi-threaded goes way back. Both more than 20 years. SIMD goes way back. GPGPU goes way back.

What is newer-ish are large scale distributed systems. But even that isn't so new any more.


For a lot of this stuff 20 years is basically new in my book. MPI dates back at least 30 years and I still sometimes wonder if the only reason we don't use it anymore is because nobody wants to deal with C. I spend too much time these days watching technology like Spark make things slower rather than faster because nowadays the Java platform (not to mention containerization) is becoming an increasingly efficient way to splat oneself into the memory wall.


> so fast that we get to reimplement the same thing, but 10x better every 5 years or so.

Citation needed. It seems it's just us getting more and more abstracted, which can make things easier but not necessarily better.

The hardest part for the next generation of developers is not having Moore's law to save them from crappy coding.


Right, I worked on an async non-blocking project in C++ in the 90s. It was a lot simpler than any modern project: no cloud, no YAML, no containers, no fancy security. Sure, it didn't have a pretty web interface, but it was great software.


10x worse is how I'd put it. First example that comes to mind: Microsoft Teams.


Teams is special.

You could've cited Slack or Discord. Teams is an anomaly in horrible software quality that *only* Microsoft can manage to produce (also see OneNote (but DON'T see VS Code -- that's somehow really efficient despite being Electron!)).


what exactly is 10x better than 5 years ago? the technology gets better but the outcome gets worse somehow.

most of the programs i use (besides the browser) are 20-50 years old (paste, cut, xterm, grep, emacs etc), almost any new thing i try is horribly slow (slack, whatsapp etc, the mac terminal app, even the mail app is slow compared to mutt)

open a web page on a computer without adblock and look at what we have built

https://www.youtube.com/watch?v=pq7NLMwynYg (funny video demonstrating the state of the modern web)


ripgrep


How do you make 7 figures writing code and not sitting in meetings all day talking about eye gougingly boring business requirements?


> Now I am writing async lock free, and the compiler is helping me prove it is data-race-free and memory safe.

I am writing async lock-free code as well, but which compiler is helping you? Rust? As far as I know, it's cutting-edge research to figure out a type system that fixes this for you. But maybe I'm not up to date.

> stuff gets better so fast that we get to reimplement the same thing

So you are leaving the API intact and just changing the implementation with new techniques? Or even better, you just rewrite the low-level libraries you are using? Sounds like the ideal job to me.


>Each time I rewrite this stuff, someone hands me 6-7 figures. This is awesome.

I envy you.


Unfortunately users are not seeing 10x better performance in the average application. It's a bit of a curse: having more resources leads to using resources more wastefully.


When I look at team sizes for '80s and '90s programs, and how fast those teams worked, and consider what I'd expect to be the team size for a modern program tackling the exact same features and how long I'd expect development to take, it also makes me wonder just how much these supposed productivity improvements—which are promoted as where these benefits are being realized, since they're plainly not showing up in program & system performance for the user—actually, like, exist.



All of our work is ephemeral. Software development is just more ephemeral than most work. 40 years ago when I first got into this biz I didn't notice this. Now I notice ephemerality everywhere. You need to make peace with it. The things you're working on now probably won't be used or useful in 5 years. An architect can drive by a building they designed and take pride in that building for decades... but even that building will eventually come down. It just happens a lot faster in tech.


Even the goal of doing something that will outlive you is an arbitrary bar: you are "settling" for mere centuries.

Making something that really lasts is hard. My favorite example is the Clock of the Long Now, a timepiece designed to operate for the next 10,000 years:

https://en.m.wikipedia.org/wiki/Clock_of_the_Long_Now


It's so weird that you'd even consider putting Rails in the same category as FoxPro, even as a theoretical.

Rails is kicking ass. A huge number of people are coming back from bloated JS frameworks to realize that Rails just keeps getting better every year.

Hotwire and similar technologies make the argument for SPAs look very questionable.

And let's not forget where we are; over 75% of the raw gross value created by YC-backed companies use Rails. So it seems like Rails is only popular with successful startups.


You don’t need to use a bloated JS framework.

I think TypeScript is important because of how badly JS operators are defined (the operator coercion matrix is just plain stupid; throwing an error would be a better option), but everything else is optional. Otherwise JS is good enough.

On the other hand, Hotwire (moving HTML around) costs real latency and money for mobile users with data plans. Since when is that better than just executing code on the (for most people powerful enough) mobile device at close to zero cost?


I fail to understand how shuttling JSON back and forth is superior to sending the HTML fragment(s) that have changed.

JSON needs to be parsed and in 99% of scenarios, this blocks the main thread.

Once the JSON has been converted to a data structure and passed around through your SPA's state logic and template rendering, what do you do? You convert it to HTML and render it.

No matter how you slice it, sending HTML fragments and updating the DOM is faster and lighter than sending JSON. My app is already displayed and responsive before your app has even started parsing.

The above doesn't even address the complexity and additional failure modes that you take on when you have to handle edge cases and errors, all of which is quite literally reimplementing functionality that the browser and HTTP already give you.

SPA supremacy is a group delusion mind virus. Thank goodness the pendulum is finally swinging back.


I am somewhat amazed at the list of tech from that article, it's like he has a magical knack for picking dead ends.

C was the 2nd language I learned and I'm still using it. The big surprise for me has been javascript - it's so...bad it became good or at least ubiquitous.


JavaScript is that guy you knew who didn’t have a lot of talent but has stuck at it for the last 30 years, so is now embarrassingly doing much better at his thing than you are.


Interesting thought experiment, how much worse would JS have to have been to have been abandoned and replaced?


C and Javascript are both worse-is-better languages, as is SQL. They have known gotchas and quirks and lots of them. But they basically work, they were good enough, and because of that they became so ubiquitous that it's impossible to replace them.


Not ... really. Those languages all became popular and got critical mass because they were the only way to access operating-system-like things. People wanted the value provided by those platforms and had to go through the language to get at them. So they buckled up, tolerated it and promptly spent decades and billions of dollars on creating wrappers, FFIs, transpilers and the like to avoid having to touch the underlying monopoly language.

People wanted browsers. JS came along for the ride. It wouldn't have been so popular if it had lived its life outside the browser


That's exactly how worse-is-better works. You're not doing it well, but you're consistently showing up in the right place and interoperate well with the things people want to use so they are forced to get to know you.


They were both THE languages of systems that became/were important. I agree that they both have the "worse-is-better" quality about them, but C is a much more appropriate language for what it's supposed to do than JavaScript is. The fact that we (mostly) haven't been able to execute anything else reasonably in the browser is the reason why it's stuck with us. C has had alternatives for a long time but stays put, because it's just better at what it does.


I was thinking the same thing. Silverlight? Silverlight??

I just ended a job with a guy who kept going on about how he was a Flash hotshot back in the Y2K days, blaming Steve Jobs for killing it. Like, dude, everyone with two neurons to rub together told you it was a fundamentally terrible idea, doomed from the start. Gawd.

I also fell for a few of his picks — Angular was a rough blow — but what a list of lousy bets.


I wonder where python will fall in history - it seems one of the few common languages that gained usage because people like it.


Python is weird. It's a nice language but it was heading for tech debt status along with (maybe) Ruby, then the ML guys went all-in on it and that saved it. But how often do you find new programs being written in Python outside of the ML/AI space? People got burned repeatedly in the 2000s/2010s by building giant empires on the back of dynamically typed scripting languages and they all ended up either doing rewrites into statically typed languages or (when successful enough) funding PL R&D to try and dig themselves out of it.


I use it - and I suspect a lot of other people also - as a replacement for Matlab. I am an electrical engineer and I write short scripts to calculate stuff and hook them up to simulations, etc. Matlab has great toolboxes, but is pretty expensive and the language is a bit clunky. Python is just very versatile and has a huge ecosystem now that would be hard to replace. A lot of system administrators also use it for scripting.


Python is great for stuff which could technically be a shell script, but really shouldn't; it essentially replaced Perl.


Which is weird because Perl didn't really go anywhere, and only got better over time. I mean, at this point it's a large dose of network effect, but I'd love to learn how this transition happened in the early days of it.

Edit: I just tried to look it up and as far back as I can find data (which is early 2000s something), it seems that Python has always been more popular than Perl. TIL!


> But how often do you find new programs being written in Python outside of the ML/AI space?

Python is pretty huge in most numerical spaces outside of ML/AI, basically anywhere people would have used MATLAB in the past. It is also the go to language in GIS and quite popular in civil engineering.


It's pretty heavily used anywhere there is data. Ie data engineering, pipelines etc.


Brian Kernighan wistfully removed the lex/yacc parser from the OneTrueAwk many years ago and replaced it with a custom parser (I believe for performance reasons). The OneTrueAwk remains the standard awk in BSD, renewed but not replaced. I don't think it's going anywhere.

The POSIX standards, flawed as they may be, have incredible staying power. These standards run our phones and embedded devices, supercomputers, current Apple workstations, and game consoles, and are significant in many other places.

Microsoft itself implemented POSIX in Windows from the beginning (likely recognizing its importance as the former vendor of Xenix), and while this has waxed and waned, running the "wsl.exe -l -o" command on modern Windows will catalog Ubuntu, Kali, and Oracle Linux images that are not Linux, but are serviced by the Windows kernel's POSIX layer under WSL1.

Applications that implement or greatly enhance POSIX have staying power.

Those who seek code longevity would do well to study it.


Counterpoint: POSIX is increasingly a creaky anachronism required to support a legacy base of old Unix software. Every modern Unix-derivative OS replaces or significantly augments core POSIX functionality with something that works better for modern hardware and software needs, or suffers in some areas if they don’t/can’t.


It can be argued the other way round. POSIX is tech debt and exists in modern machines so widely mostly because there happen to be free implementations of it. My experience from interviewing has been that a remarkably large number of developers actually never interact with POSIX directly these days: they never work with files and never open a socket or fork process trees. They certainly never use UNIX-adjacent stuff like X. Instead they interact with APIs layered on top like NodeJS, Java, C#, Android, Swift, HTML. None of which bears much resemblance to POSIX. Like, if you read about how to work with the network in iOS or Android you won't be directed to POSIX APIs.


I might counter with "adb shell" on Android, and the C behind Objective-C, as development tools where the standards are still very much present.


> that are not Linux, but serviced by the Windows kernel's POSIX layer under WSL1.

that's not how that works.

WSL is a Linux kernel running under a hypervisor integrated with Windows.


That's WSL2 you're describing. WSL1 is a Linux syscall reimplementation in the NT kernel space.


I see and TIL.

Still, said syscall reimplementation was not based on the old and long-gone NT POSIX personality (just sayin' so I save face...)


If you examine the wiki, you will find that the POSIX layer of Windows is the first implementation.

https://en.m.wikipedia.org/wiki/Windows_Subsystem_for_Linux

I had known for some time that VAX/VMS was strikingly similar to Windows NT, but I learned a few days ago that the failed Mica project introduced a POSIX interface.

https://en.m.wikipedia.org/wiki/DEC_MICA


You're describing WSL2, but the post you quoted explicitly said WSL1.


But is WSL1 supported by the POSIX layer? I don't think so. I think it uses some of the mechanisms built for the POSIX layer but I think it is a separate "personality".


See my post to a peer; this comes from the DEC Mica project, the cancelation of which directly led to Windows NT.

Windows was designed in part as a UNIX kernel.


Alright, let me be more clear: WSL1 is NOT the POSIX personality of Windows NT. The POSIX personality was very dumb and minimal. WSL1 is NOT that. It is a different thing. And the POSIX personality doesn't even exist anymore.


Look at the wiki for Mica. The goal was for a whole implementation of Ultrix alongside of VMS.

It fell apart in trying to allow both sets of system calls to be used by a single process. This failure is likely a huge reason for Windows NT, as it led to cancelation.

I used Ultrix on a DECstation 240 in college.

"However, it proved to be impossible to provide both full ULTRIX and full VMS compatibility to the same application at the same time, and Digital scrapped this plan in favour of having a separate Unix operating system based on OSF/1 (this was variously referred to as PRISM ULTRIX or OZIX)."

https://en.m.wikipedia.org/wiki/DEC_MICA


There is plenty of code I've written that's still out there doing useful things.

I'm sure some not insignificant parts of Windows, Linux, tools, libraries, browsers etc. etc. are fairly old code that just keeps working .. perhaps with fixes and improvements.

Good code lasts a long time. Technical debt is something you are continuously paying "interest" on. Just like any debt, sometimes it's a good thing and sometimes it's a bad thing.


> perhaps with fixes and improvements.

in my experience, if there's no will or budget for a rewrite, it gets virtualized, locked away behind a firewall/corporate network with a restricted set of users, and will basically run forever as long as the ISA and virtual storage is supported by emulation and the business process still exists and is able to support the infra/people involved.

think of it like a zero coupon 100 year bond in technical debt issuance underwritten by the central bank (corporate hq). and highly liquid in the sense that a virtual image is easy to move around.

the good part is it just gets faster and takes up less % resources as hardware improves, and forever-bugs are usually worked around and documented by the people pushing the buttons.


I got a new laptop recently and I hadn't used Windows in probably a decade. It came with windows 11 and I saw so many old interfaces under the hood while exploring that it made me laugh a few times.


Agree with the premise. I currently work with a PM that is simply terrified of introducing any technical debt or "throwaway work". Mind you, he isn't writing or maintaining any of the code himself. It's like pulling teeth getting him to realize that we have to ship some product in order to learn, and some of what we will learn will be what is WRONG with our product and needs to change (aka throwaway work) and that every codebase eventually becomes a legacy codebase (aka technical debt).

I think we've done ourselves a very big disservice as an industry by focusing on this boogeyman.


A better way to think about technical debt is to understand that what WAS technical debt-- the residual tasks necessary to make something reasonably inexpensive to maintain and improve that come from kludging things together under time pressure-- gets FORGIVEN when the whole technology on which a product is based becomes obsolete.

Products don't become technical debt, they merely depreciate along with any associated technical debt. This is why I don't like the term technical debt: it implies something that must be paid back. But you don't have to pay back the "debt" on a product that will be completely replaced anyway.

When I have a long list of todo items to perfect my work for a client, and then they run out of money for the whole project and end my contract, I don't say "oh no, now I'll never dot those i's and cross those t's!" I say "yay, I will cross everything off my todo list forever."


In the future there won't be any technical debt. You were lucky to live early.

Code, most code, will eventually turn into some sort of sanitised applied maths corpus (with a heavy overlay of "AI" analytical tools).

In applied mathematics you have cultural obsolescence (stuff that we no longer find interesting or relevant, like an asymptotic formula for the Airy function [0]) but the corpus is intrinsically add-only. Once something is solved it does not have to be solved again. Its essence is eternal, so to speak.

Think about code development in such a mature state. It will mostly work by pulling together (using AI prompts) logical units from the future version of Rosetta [1].

When we get to that stage it will be hard to add something truly new. Just like it takes a long incubation and a PhD to add some marginal new thing to applied mathematics, it will take effectively trained applied mathematicians to add something minor to the evolving code corpus.

Coding for the common folk will be mostly compositing. It might be productive and even fun, but not quite the same.

We lived through a period of widespread democratization of development. Cherish the freedom of reinventing the wheel in countless flawed ways :-)

[0] https://en.wikipedia.org/wiki/Airy_function

[1] https://rosettacode.org/wiki/Rosetta_Code


> Code, most code, will eventually turn into some sort of sanitised applied maths corpus

> Think about code development in such a mature state.

I agree with you. That end state is of course all open source. Anyone not working toward that state is making money (understandably) as the higher priority.

I'm still not sure why businesses do so much proprietary customization as opposed to working in an upstream OSS project. I suppose the OSS foundation in those cases isn't solid enough yet.


I don’t disagree with the author, but I don’t worry much about old tech that is no longer useful.

I have worked in the field of AI for 40 years, and we have had a huge trash heap of technologies that ended up being useless except for lessons learned from failure.

I have always been motivated to work for just two reasons: supporting myself and my family, and learning and using new tech. The great fun is in learning new things. That said, sometimes there is good short term work supporting old tech that companies still want to use.


> huge trash heap of technologies that ended up being useless except for lessons learned from failure.

The joke is, even the lessons will be deprecated at some point, either when younger generations who have not learned or understood them yet enter the playing field, or because the progress of technology invalidates them.

Nothing is really meant to stay.


https://phrasegenerator.com/ , one of my first web projects, has survived since 1996 (almost 30 years, wow). It's definitely required a few "tech debt collections" - the original was in ColdFusion, the hot stuff of the time. Then an asp.net / xml phase when that was cool. Now hopefully a boundless future of stability and modernity with python and JavaScript. At least until the world sunsets any pre-4 python versions, and browser JavaScript is thrown away in favor of pure WASM. But seeing as we still support the <font> tag, I've got some time ...


This is what blows my mind about web stuff.

You needed to completely change frameworks and even languages 3 times in not even 30 years?! Just to keep things maintainable?

I don't feel that it's the same for other areas of programming: desktop apps, embedded stuff. Maybe I'm wrong...


Sure it is.

Win16 -> Win32 (only partly backwards compatible) -> .NET WinForms -> WPF -> WinUI 2 -> WinUI 3 (with of course Java and Electron being in the mix too)

... or ...

Motif -> GTK 1 -> 2 -> 3 -> 4 (all backwards incompatible)

... or ...

macOS Classic -> Carbon -> Cocoa/ObjC -> Cocoa/Swift

The browser is actually one of the more stable environments out there with a lot of the churn being on the server side and optional JS frameworks as people oscillate around trying to figure out the best way to wrangle a document renderer into being an app platform.


A Delphi app written in Delphi 4 in the mid/late 90s would still run just fine on the latest version of Windows today. Win32 works just fine even today.


I used to love Delphi but I'd expect most people still consider code written in it to be tech debt. How many new projects are being started in it today?

Sadly Delphi became debt at the same time Win32 did. Win32 "works" in the modern era only as long as you restrict yourself to opening a blank window and putting a D3D surface in it. Very few apps use it beyond the absolute minimum level required anymore, because it's just a giant pile of horrendous tech debt. It's so tech debty that Microsoft have tried numerous times to kill it completely; their failure to pay off their debts doesn't make it non-debt however. How many new apps are using comctl32 widgets and WinINet these days? They all ignore it as much as possible and rely heavily on fat wrappers or shipping reimplementations for the rest. I mean that was like 50% of the value of Delphi as a product - it wrapped Win32 in an API layer that made it actually digestible.


A website made in 1999 still works today; the browser is like an OS and the consensus is that you should never break userland, e.g. you don't break the web. Then there are as many frameworks as there are developers.


Which is good ... and bad at once. Just think how nice the JavaScript API could be if we allowed it to break. Or CSS. Or HTML.

But I completely agree with your point. Underrated quality of browsers.


The company I work for is using a C64 mainboard for controlling some production equipment... and there are ABSOLUTELY no plans to move to anything different. Is this system deprecated from a technological standpoint? Yes, absolutely. But on the other hand it is a super stable solution that will work tomorrow in the same way as it has worked for the last 30 years (and with the whole retro computing scene around, spare parts will be no big issue in the foreseeable future).


But it will be an issue _eventually_... hopefully you'll be able to move the workload onto a VM / emulator easily.


Depends on the equipment. Either it is physically dismantled by then, or people find ways to keep it running. No joke, I worked at sites that ran production equipment from the interwar period and, in one case, the early 1900s. If it is profitable, and you are allowed to run it for environmental reasons, it will run for a long, long time indeed regardless of the tech (controllers, chips, software) used in running it.

By the way, that pre-pre-war stuff ran in the southern part of Germany. So funds and availability for new equipment were not a big issue in the literal century+ since it was built.


Maybe implementations have much less value than has been assumed and paid for by stakeholders over the decades. If code is a throw-away implementation detail of some model/abstraction, then why are we getting paid so much money again? It really makes me worry about generative AI approaches; just as well there is no universal very-high-level modelling language everyone loves yet. I think this is why working on games is nice: it's not technical debt, it's just a toy. Maybe everything is.

Our cathedrals are surely [L,U]inux and the C programming language; HTML has done pretty well too.


Don't confuse short-lived for low value. Software solving the right problems has incredibly high ROI, even accounting for high programmer salaries.

We're not getting that money for generating realistic-looking syntax, but for discussing with stakeholders and choosing appropriate designs with an eye for both the bigger picture and details. These are language-and-library-independent things, mostly, and incredibly difficult work.

You may be able to give an LLM directions to do something like what you would have been able to do, but the money is not in the parts the LLM is able to accomplish, it's in the direction you give it.


"Your grandfather was a brilliant man and his peers all praised him. Go ask him about it!"

"Grandfather, what did you spend your life on?"

"I made people click ads. Don't worry, it's all technical debt or deprecated now ..."


This is such a silly statement. You could say it about practically anything.

"Your grandfather was a brilliant man and his peers all praised him. Go ask him about it!"

"Grandfather, what did you spend your life on?"

"I fixed people's shoes. Don't worry, ..."


lol


That's a lot of fad-based tech in this article but you do what you have to do. The first language I used professionally is not the same language I use today. However any one of them certainly could have become "fad" tech... except maybe C, Lisp, and Haskell; those ones seem to be built on foundations (discovered or invented) that are repeated, rediscovered, or reinvented but rarely change.

I generally think of "fad" tech as frameworks and ecosystems that are built around a commercial interest or novel idea. They often fail to overcome network effects. And this leads them into obscurity to await deprecation.

There's a lot of churn that happens in the frothy red waters of "trying to make programming easier/faster/accessible-to-non-programmers".

And we're not so great at maintaining our legacy, the state of the art, the pedagogy and history of our science. Over a twenty-plus year career I've seen people re-invent solutions to the same problems over and over. Each time it's a revolution. You don't want to dishearten the young and eager, but at the same time, seeing them run into the same problems and reach the same conclusions means we've not been doing a great job at teaching and mentoring and all that.


Don't define your career by the technologies you've used, but rather by the problems you've solved and the knowledge and experience you've acquired.


What do all these things have in common?

MS Visual Basic 6, MS ActiveX, MS Silverlight, MS Visual Foxpro, MS C# .NET Compact Framework, MS ASP.NET WebForms, MS ASP.NET MVC, MS Windows Communication Foundation...

I think there is a common theme there. But shhh, don't tell him.


They're all faster, more productive and richer environments than modern web apps?


VB6 and FoxPro, yes. Others, not so much.


Why not just say what's common (over-hyped dead ends)? Snark doesn't make you sound smarter.


I thought it was a smart comment. I did not notice the commonality before 29athrowaway pointed it out.

29athrowaway said eight times what's common. Eight times!


Microsoft.


Most of that was replaced by their own .NET Core and Google's efforts to optimize JavaScript and establish Chrome as the de facto browser, which remains one of Google's few ongoing projects.


Oh yes… those were for sure the only thing to perish.


The Space Jam website from '96 is still up :). Standards go a long way.

https://www.spacejam.com/1996/jam.html


You forgot BizTalk, Commerce Server, MTS, Visual J++ :-)


I guess I was lucky: I started 15 years ago doing web development using python and django on linux, which is still my toolset today ;)


Nice. I was a less smart person who wanted to learn, like everyone seems to these days (but not back then), the latest and greatest. My Django projects from 10-15 years ago are running fine in production and get updates in their respective companies. All the rest I've written has all kinds of issues. And Django is still relevant, just not hip. Should've stuck with it.


I’ve worked on the same website for 20+ years. Recently we were informed that the site was being shut down and we would be reassigned to other teams. On the one hand, as long as they are still paying me exceedingly well, do I really care what I am working on? On the other hand, it is a bit painful seeing 20+ years' worth of work being deleted.


I've been doing this for over 30 years. Not only have I seen years of my work deleted, I've also seen large fractions of the rest of it never even used.


> I've also seen large fractions of the rest of it never even used

These were some of my favourites as a young freelancer (long time ago): they'd pay me $90/hr to write stacks and stacks of code with our small parents'-garage team, and when we finished, we demoed to the client, who was very happy. But instead of having to wait for bugs, fixes and additions, we just got a new assignment after learning 'great job, but this is no longer needed'. Now I find it painful, but back then we could just focus on 'new things' like that. It didn't happen often, but there were a few large ones that I remember well.


I've been programming professionally close to 16 years. Here is what I can remember of my code:

# Job 1 (3 yrs)

  - Worked on around three products, all shut down, code probably lives in some SVN archive.
  - Learnt advanced JS, PHP, MySQL, Photoshop, jQuery etc (skills mostly relevant)
# Job 2 (1.6 years)

  - Project never launched, code never saw the light of the day. Probably lives in some Git archive.
  - Learnt a few in-house frameworks (irrelevant) but also learnt Git (relevant)
# Job 3 (8.5 years)

  - Worked on several products, the biggest one is still active and seeing millions of users weekly. Rest got shut down and live in a Git repo.
  - Learnt about some inhouse frameworks (irrelevant). React, React Native (skills still relevant)
# Job 4 (2 years)

  - Actively working
  - Learnt Vue (skills still relevant)


From the article "Swift is another excellent example of how fast development tools change. As soon as Apple released Swift, it was hard to justify writing code in Objective C anymore. I am sure there are some use cases where it is still needed. But Swift is significantly easier to develop and a major evolutionary step forward.

I would argue that any apps written in Objective C are probably technical debt now"

I switched to Swift a few years ago after many years of Obj-C. At first I was reluctant, as there were still many things I liked about Obj-C, but Swift won me over. I thought I would never touch Obj-C much until I had to integrate a C++ library; I can't believe how much I forgot in such a short time. It was interesting to find out how to structure the Obj-C code to translate nicely to Swift though.


a thing most of these have in common is that they're proprietary, so the users are dependent on the companies that own them for enhancements, bug fixes, and ports to new platforms. this is a recipe for wasting your time

visual basic, asp, coldfusion, foxpro, activex, flash, and silverlight, windows ce, asp.net, webforms? proprietary, proprietary, proprietary, proprietary, proprietary, proprietary, proprietary, and proprietary

and mostly pretty dead as a result

how about the non-proprietary things in the list? html, css, js, fortran, java, ruby, and rails are all just about as alive as they were 10 or 20 years ago, if not more so, except that rails didn't exist then

the exceptions are perl, objective-c, and the js frameworks. perl, ember, and backbone aren't going to disappear anytime soon but they will likely continue to stagnate. but unlike silverlight or windows ce, you can probably run your perl and backbone and angular and react and swift code 10 or 20 or 30 years from now on whatever platform people like then

unless the platform is centrally controlled by an owner who forbids it, so try to avoid platforms like that

(java applets were already dead 20 years ago, and soap sucked from the beginning, so these examples are out of place)

there is certain knowledge with a very limited half-life. but hopefully you aren't spending most of your time learning react apis or wasm instructions or editor keystrokes or chromium bugs, but rather general principles that transfer across domains. algorithms, reasoning, type theory, math, writing skills, hierarchical decomposition, scientific debugging, generative testing, heuristic search, that kind of thing. a lot of that stuff goes back before computing

and most code has an even shorter lifespan than the knowledge we use to build it. which is as it should be: most code is written to solve a problem that won't last decades or even years, but it's still profitable for companies to have it written. modifying a big system is harder than modifying a small one, so it's better to maintain just the code you need for today's problems. writing code and throwing it away is mostly fine

still, i've spent my career on free-software tools, and their half-life seems to be a lot longer, about 25 years. i reviewed some of the stuff i'm using right now in a comment on here 10 days ago, https://news.ycombinator.com/item?id=35829663

i know a guy whose preferred programming editor is ex. the non-full-screen version of vi


> whose preferred programming editor is ex

It's intriguing, because that implies he can work on most things in a way that "fits inside one's head."

Do you know if he extensively calls out to external tools, custom aliases, and the like?

I imagine there is some fullscreen mode where he can--temporarily--suspend ex, make use of custom sourced mappings, and the output is piped back into his editing session.

It's hard to find certain tidbits, so I'm always glad for even the smallest details.

Thank-you.


I'm the ex(1) troglodyte in question. I don't use custom aliases, because I have had all editing tasks hardwired into my brain for decades, ever since I dropped Teco for ed and then ex. (Ed is the standard editor, but I'm willing to trade off a little standardosity for added convenience.) If the te command had been widely available, I might have stayed with it and not used ex, but that day has passed.

I do pipe stuff through various commands more or less constantly. I most often pipeline through awk, sed, sort, uniq, cat -n, etc. I have a back-burner project to write a meta-editor that works by pipelining and maintains a whole persistent undo-redo history.


i was wondering too! thanks!


Not very convincing. The world is full of abandoned open source projects and companies that maintain proprietary platforms near indefinitely. Angular 1 is dead, Python 2 is dead, Perl is dead, GTK 1-3 are dead, old KDE is dead, X11 is dying, but you can still buy supported versions of Delphi and COBOL, and even VB6 is still somewhat maintained by MS [1]. ActiveX was killed by browsers (open source); the underlying proprietary APIs are very much not dead.

[1] https://learn.microsoft.com/en-us/previous-versions/visualst...


it's not at all difficult to run code written for angular 1, python 2, gtk 2, gtk 3, or x11; you don't even have to port them, you can just run them, and i do every day. it doesn't even require any effort

contrast that to applications written for silverlight, activex, or flash

with respect to gtk 1, you sort of have a point, but gtk 2 was released in 02002, and it's usually not hard to port gtk 1 apps to gtk 2 (and almost all of them have already been ported). and i'm not sure what you mean about 'old kde'

free software licenses provide strong legal protections to users to do whatever is necessary to maintain and extend the software they depend on and share that with others, and in practice they do, and the consequence is that the hype cycle is much slower and less destructive


I have found my doppelganger. The exact same technologies, the same progression; I even wrote a VIN scanning app for car dealers myself. My mind is blown.


I will say that the core idea of SOAP/WCF services is honestly pretty sound. There is no sensible reason (imo, anyway) that everyone needs to be using stringly-typed data and manually parsing REST responses when the underlying code all has type definitions. I get that it came with headaches (and I suppose people wanted to work in untyped languages) but jettisoning the entire idea of generating the client feels like it was throwing out the baby with the bathwater.


But you know what happened to SOAP? It was smothered by its own success. The allure of interoperability and extensibility attracted everyone and their cat to the party. With so many stakeholders involved, it became a classic case of "too much design by committee." It's like trying to paint a masterpiece by having a thousand artists contribute strokes—chaos ensues.

And let's not forget the influx of junior developers during that time. I mean, we can't blame them entirely, can we? SOAP standards were complex and enormous. It's no wonder they struggled to grasp the underlying paradigms. We had an army of fresh faces flooding the scene, and the sheer complexity overwhelmed them.

So, SOAP ended up being the baby tossed out with the bathwater. It had its merits, but the challenges it faced were just too much to bear. Still, it's worth reflecting on its strengths and the lessons we can learn. Maybe someday we'll find a way to strike a balance between the elegant core idea and the practical realities of implementation.


It ended up being so complex that none of the implementations were up to spec, and hardly any two implementations talked to each other out of the box, i.e. the exact opposite of what it was meant to do.


can confirm


We are pretty much right back where we started with OpenAPI. You can generate a client (or a server) from a YAML file that a lot of frameworks can generate for you from your code, and the file also describes the types. I wouldn't say it is much simpler though; the schemas can get super complicated (as can configuring the darn API Gateway people inevitably think is necessary).
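
For instance, a minimal sketch of the "framework generates the spec from your code" half, using FastAPI as one such framework (the endpoint and model here are made up); the resulting schema is served at /openapi.json and can then be fed to a client generator:

  from fastapi import FastAPI
  from pydantic import BaseModel

  app = FastAPI()

  class Widget(BaseModel):
      id: int
      name: str

  @app.get("/widgets/{widget_id}", response_model=Widget)
  def read_widget(widget_id: int) -> Widget:
      # the typed path parameter and response model end up in the generated OpenAPI document
      return Widget(id=widget_id, name="example")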


It's not like people don't find a way to screw up REST semantics and make that complicated anyway.


Absolutely. The lasting ideas are KISS and Occam's Razor.


Sure, but there's such a thing as too simple. It is distressing to think of the number of man-hours wasted because each and every application using JSON has to have its own convention for handling dates, since that was one of those things too complicated for the spec to have an opinion about.
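
A minimal sketch of the kind of convention every team ends up hand-rolling (Python here; the field name is made up): encode datetimes as ISO 8601 on the way out, and hope every consumer knows to undo it on the way in:

  import json
  from datetime import datetime, timezone

  def encode(obj):
      # house convention: datetimes cross the wire as ISO 8601 strings
      if isinstance(obj, datetime):
          return obj.isoformat()
      raise TypeError(f"not JSON serializable: {type(obj)!r}")

  text = json.dumps({"created_at": datetime.now(timezone.utc)}, default=encode)

  # ...and every consumer has to know the convention to get a datetime back
  decoded = json.loads(text)
  decoded["created_at"] = datetime.fromisoformat(decoded["created_at"])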


I totally agree. JSON at the application level is too simple of an abstraction, leading to unnecessary complexity.


> Ruby on Rails is in jeopardy of being added to this list. It has fallen out of favor, and it is tough to find developers for it.

Ruby is the 8th most popular programming language. Ahead of C, C#, and a bunch of "cool" languages like Scala, Kotlin, and Rust.

Source: https://madnight.github.io/githut/#/pull_requests/2023/1


I've seen things you people wouldn't believe... Attack ships on fire off the shoulder of Orion... I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain... Time to die.


> printing from the browser was its own fun nightmare

Remember Crystal Reports? Oh god, the pain. The pain.


I used that with classic ASP as well as with ColdFusion :-) Eventually SQL Server Reporting Services replaced it in the places I worked.


Remember? If only.

I work outside of tech. We still use it.


Yep still kicking in the water industry in Australia. Glad I left that behind


I got hired for a job specifically because I knew crystal reports! I don't miss that.


I got hired for a job because I was willing to learn to program on OpenVMS. I'd never heard of it, but asked the interviewer "it's a lot like UNIX, right?". He looked at me, waited 5 or 10 seconds, and said "no." Oh well, I passed their C and SQL tests, they weren't going to let me go that easily.

I'll probably retire when they finally retire it, but at the rate they're going I may beat them to it. I won't miss it, but I won't regret it either.


This is largely why I have tried to build as much as possible with OSS and contribute stuff as necessary to make that possible.

OSS tends to have a much longer relevant shelf life, and experience with it, especially with internals, remains a highly transferable skill.

My work in and around Apache and CNCF ecosystems has been the only code I have written that truly endures in a good way rather than an ossified and decrepit legacy way.


I've been down on myself about this. I've been at it for coming up on 30 years. I've written scores of programs and applications. There are only a couple still in use. Rather than be upset about it, I find that it's more helpful to think of coding as tooling. As in, actual tooling that machine shops make: dies, jigs, gauges, etc. You know, the stuff that's used to make the actual stuff. Tools go out of date. Tools get upgraded. Product lines change. Tools get scrapped. It's all ephemeral. It's just the nature of it. And this applies to "the bigs" as well, like Facebook or Google. The individual components and services are just tooling in service of the overall product, and rollover and change as time goes on. The only coding that ossifies into archeological strata is COBOL on mainframes.


Over time I've come to realize that the various "IT things" in business which retain the most value are databases. Not because they never become outdated or redundant but because the data in them can always be migrated to a new schema to be used by a new application. Usually the "tooling" to do this is exactly the kind of work I might do which becomes something that is no longer in use.

(I guess here I should mention the context that I'm not coming up on 30 years; instead it's a little over 10.)

The "no longer in use" part is something I think I ultimately disagree with. It's kind of an "application of Theseus" situation. Where did this data really come from? If it was an older application, did that application ever really go away or did it just become what replaced it? Anyway, I guess I just have to hope I still have this outlook in ~20 years.


Interestingly enough, I've got a thing for "ancient" technology.

My dream retirement job is to work at a tax agency, like CRA or IRS, and help maintain their mountain of COBOL. I don't know COBOL, I don't know the ecosystem around it, but I absolutely know that I would love to learn it.

My first job was in a similar environment, supporting a homegrown application (which had grown out of a long-defunct commercial application) running on a Pick-style database system (UniVerse). The whole thing could trace its roots back to a Prime mainframe. Reading the code, especially the older stuff, was such an adventure.

ScarletDME[0] is seriously scratching the itch to play in this world.

[0] https://github.com/geneb/ScarletDME


Which is better/worse, the hack with a plan which is removed promptly or the barnacle that persists forever?

The barnacle did provide more lifetime business value after all.


Barnacles can cut you pretty badly if you're not aware of them, and they have a habit of reproducing. When you have a large colony on the hull of your vessel they impose a nontrivial amount of drag.

(I'm more committed to the bit than this position though, it's a judgement call that an engineer must make relative to the requirements and resources available.)


I write a project using primarily a foss stack which has a 20+ year history of being committed to backwards compatibility, and I couldn't be happier.

Utilizing the Lindy Effect to my benefit, I've been able to almost entirely avoid the typical framework frustrations, such as breaking changes and abandoned dependencies.

Instead, I've been able to focus on features, figure out a good code style, and support nearly all mainstream or once-mainstream browsers. On the back-end, I'm working on the Windows install process, but on *nix it's fairly uniform across different flavors and lineages.

For those curious, it is Perl, text files, PGP, HTML, CSS, low-sugar JS, and a little bit of sh, Python and PHP for server glue. Now probably to include batch files...


From my few years in e-commerce I noticed this kind of pattern:

- you (your company/employer/etc) have a problem

- you solve said problem

- profit

- competitors catch up and hit the same problem

- 3rd party (or one of the 1st parties pivots) notices all of you have the same problem. implements solution to fix the generic version of the problem: a standard is born

- hubs appear that make it easy for your competitors to eat into your market share because they now use the 3rd party solution and are more agile.

- you can't integrate because your solution is not complying to the new "standard"

- rewrite is needed

(edit: I hate HN formatting)


Am I reading this incorrectly, or is the author equating "using old/boring/forgotten tech" with technical debt?

How did technical debt become so bloated and meaningless? Isn't it "remaining half-baked/incorrect code, known edge cases, bad/slow implementations, or even bugs due to constraints imposed on the engineering team"? How are dead technologies imposed constraints?

The author is not talking about technical debt but about experience that is not directly applicable anymore.


Lots of "near-RAD" tools with large runtimes and platform/environment/framework dependencies in this list.

The typical thing some of us tried to avoid while others (or their employers) totally embraced these comfortable (advertised as productive and fast) and often proprietary tools. The contrast would be HTML, JS, C/C++, all closer to the bare metal of their respective targets. Or Java and Python as the two that won the race at the more complex but purely language level, as Rust is doing right now. Java has gotten connected more and more to its typical business frameworks these days, so beware :)

Some of them died just with their platform, as ObjC/Swift would die with Apple, or Kotlin with Android, these days.

It seems a split between language and environment/frameworks has proven to provide some stability at least.

It's rare (and funny) to see such a "consistent" list, though :) There might be something like "consulting work" as the recurring theme in it. As a product/system/platform developer one might have put (and would have been forced to put) more effort into selecting tools for long-term availability, where they are not forced upon you by the platform or by time-to-market considerations.


I started with 6510 assembly. I have 25-year-old Delphi apps. But the Node.js and Go hypes are also 10+ years old. If you haven't learned, maybe you are stuck on purpose.


I kinda find the premise here to be flawed. To me, technical debt is stuff that you do intentionally that you know is bad and will have to be fixed at some point, but you've decided the trade off of shipping sooner is worth it.

Sure, all code rots, frameworks and even languages come and go. That's not technical debt, to me, though.

Put another way, if everything is tech debt, then there's really no point to having the term at all.


When I look at the list of "technologies" the author invested in learning ... no surprise it's all technical debt or deprecated: all the things he lists are completely superficial and have an obvious short-term, "I can make money now with this stuff" flavor to them.

Had the author invested in learning stuff like math, physics, cryptography, advanced data structures and algorithms, functional programming theory, 3d programming, compiler theory, database theory, etc...

None of it would be obsolete today, and all of it would be almost immediately applicable in any language that happens to be the flavor of the day at a given point in time.

Let that be a lesson to all of the folks who become extremely proficient in the latest react-like fad javascript framework: in 10 years time, all your knowledge will be useful for one thing: maintaining and patching old crumbling code piles that no one wants to touch.

Do yourself a favor instead and go learn timeless things that will still be completely relevant 20 years from now. Then spend a minimal amount of time learning how to apply it in whatever language/framework of the day is fashionable at your job.


Think of the hours lost building code that becomes useless. This is one of my greatest regrets in writing code. I blame the likes of Microsoft, which have consistently created bad frameworks that locked out other options and were designed for their own buggy operating systems. This is still happening; it's one of the reasons I do not use Microsoft. Hang on, I lie: I am using VS Code on my Linux box at the moment.


All my desktop GUI software since 90s was / still is done in Delphi (Lazarus for multiplatform). The language / IDE grew but remained compatible and I've never had to throw away the code.

C++ merrily carried my backend solutions over the same timeframe with the same results.

Same for C when writing firmware.

Same for JavaScript / HTML for browser based front ends.

In all of the above I used some domain specific libs but stayed away from big frameworks as those come and go pretty fast.

I consider none of these as tech debt, as they let me concentrate on the product rather than dwell on what tech I use. I've never felt inferior for not using the new and shiny doodads, as I've always delivered superior products and that is what mattered to my clients.

Yes, I had to program in a whole bunch of other languages upon client requests, but this was rather rare. I have a good track record in creating new products from scratch and that is what my clients really want. They mostly do not care what I use for development.

So no. I do not feel that tech debt at all and I still play with other tech a little to stay current in case it is needed by client.


BSD's whole /bin directory disagrees with the thesis that everything eventually becomes technical debt.

If you pick new technologies, don't be surprised that they will be quickly overridden by something else. Pick stable and boring frameworks and languages. E.g. lots of old Java Swing applications run on today's computers without recompilation.


Years before this guy got started, I was working in C. Then I moved onto more C, C++, then back to C. Today I'm working in a C++ codebase that compiles to C.

Lisp on the side, plus the Unix cruft that goes with build systems: shell, awk, make, ...

I went through the language churn as a kid. BASIC, assembly languages, Pascal, Modula-2. Studying CS pulled me into C world; all the upper level coursework at that time was systems programming in C on Unix, whether it be compilers, networking, distributed systems, operating systems or computer graphics or what have you.

I didn't think I'd be cranking out C for another 30 years after that, but I also didn't think of any reasons I wouldn't be.

Not everything I worked on is around; but the skills left behind are entirely relevant. There is hardly any technique I ever used that is inapplicable.


For much of my (35+ years) programming career I have worked in Delphi/Object Pascal. When I was first introduced to it, it was the new hotness, but the last Delphi job I worked on was supporting a 15 year old legacy product.

On the other hand I'm currently working on Android, where everything seems to be obsolete within a year...


The technology might have aged, but the initial concepts used for that technical design are still there and relevant.


> Ruby on Rails is in jeopardy of being added to this list. It has fallen out of favor, and it is tough to find developers for it.

Hardly.

> What once made it unique is now available in other languages.

Sure, yay open source.

You still can't get as much out of the box with one CLI command for a new rails app in node.js. I would love to be corrected on this.


"In the long run. we are all dead". But I find it hard not to look at software and think how ephemeral it all is. All those late nights and all that debugging and it has mostly disappeared without a trace within 10-20 years. And that is the stuff that even got used at all. :0(


Don’t stay up late. Work during normal business hours, and don’t stress too hard. Almost none of what most software developers are doing really matters.


PHP.

Still works and the codebase is still being incrementally updated ~20 years in.

We had a stint with Laravel, which was a huge task to add, and an even bigger task to eventually remove, after an update by Laravel that was not backwards compatible made us rethink the cost/value.

Javascript on the frontend, with no libraries as critical infra except for HTMX.

I was once a contributor to MooTools; now I use quite a few homegrown funcs, and use libraries for spot instances (Alpine, Uppy, QuillJS - all of which are written modularly, with the intention of being traded out).

MySQL and eventually Postgres. "NoSQL" only when it was added (as JSONB) to Postgres.

All of those years of coding are an asset, not a liability.

Don't go chasing the new and shiny, it doesn't last. (And steer clear of Web3, will ya?)

EDIT: We use Tailwind - and time will tell if that is a mistake.


> Ruby on Rails is in jeopardy of being added to this list. It has fallen out of favor, and it is tough to find developers for it. What once made it unique is now available in other languages.

I beg to differ. There are 1000+ Rails developers actively looking for work on railsdevs.com


Yea Rails seems to be having a bit of a renaissance moment right now


I wish more mid career devs understood this. All those PR review battles over commas in comments are mostly wastes of time. Entropy comes, it's one of the few guarantees we have. Your legacy will not be the software you wrote, but the impact that software had.


Skills I learned and are still valid today:

- SQL

- Terminal

- OO programming travelled pretty well from C#/Java to Objective-C to Swift

Then there's stuff that is still called the same but completely unrecognizable compared to the time I was good at it, like HTML / JS / CSS.


The lessons I've learned from all my programming projects are:

1. If the codebase would still work on a "dead platform", but you are targeting something new, you are chasing after hype, to some degree.

2. If the code involves significant data entry, you probably want to leverage a spreadsheet, because eventually you will want to edit something tabular and also make charts and reports. The things "dashboards" do are also very reasonable to do from inside a spreadsheet - make a tiny shim to pipe in some data, and then make the dashboard frontend using all the built-in goodies (a tiny sketch of such a shim follows this list).

3. If the code is mostly about making custom UI, you are on the path to saleable application software, or at minimum, a tech demo that people will talk about.
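
A tiny sketch of the shim idea from point 2 (just the csv module; the columns and records are made up): dump the records somewhere a spreadsheet can import, and let its charts and pivot tables do the dashboard work:

  import csv

  rows = [  # hypothetical data-entry records
      {"date": "2023-05-01", "item": "widgets", "qty": 12},
      {"date": "2023-05-02", "item": "gadgets", "qty": 7},
  ]

  with open("entries.csv", "w", newline="") as f:
      writer = csv.DictWriter(f, fieldnames=["date", "item", "qty"])
      writer.writeheader()
      writer.writerows(rows)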


I made a whole bunch of things that have not been deprecated or tech debt. It's about which technologies you use.

C is not deprecated. Maybe in 100 years it will be, but it seems unlikely.

It's very rare to find someone who has C as their favourite language, but it's incredibly persistent.

C is the opposite of the latest JS framework.

I regularly fix bugs in open source where, when checking when the bug was introduced, the trail often runs cold in the early to mid 90s, as it predates that project's use of source control.

But yeah, if you have a 20 year career of following what is clearly the latest fad, then you'll have that experience.

Java applets were never "big". They were a fad, and promise of big. But they always sucked, even by the standards at the time.


> It's very rare to find someone who has C as their favourite language

Hello, I visit here every day. Nice to meet you


You should make friends with Steve Gibson (https://www.grc.com/). He writes all his code in assembly. :-)


Don't worry, I personally know plenty of devs whose favourites are assembly, C, C++, Pascal, Perl... even BASIC, Fortran, COBOL or Forth!

And surprisingly enough, among them, even the old farts aren't so old.


Bah, as someone who until recently had C++ as favorite language (now Rust), I'd say C++ doesn't belong on that list!

But so say probably all those people too, about "their" language. ;-)


I still maintain apps in Objective-C, but I don't view them as technical debt. I find it mostly fun to maintain the knowledge.

One day the apps (products) may themselves become technical debt, but that is not the fault of the language they were written in.


> I predict WebAssembly will eventually overtake how front-end development is today, and a whole new world will evolve.

I have heard this with great emotion for at least 6 years now and nothing. It’s starting to look like bitcoin: all these dreams and ideals that come to nothing in practice but a Ponzi scheme.

The funny thing is that it clearly indicates nobody knows what they are doing. People advocating for WASM want DOM bindings because the DOM is great and JavaScript is the great evil. Most people who actually write JavaScript professionally feel equally about the DOM, a great evil, and so they are completely reliant on some massive framework to justify their existence.


Sooooo then LISP for growth, profit and staying power?

Sorta like how LISP might be the "Guns N' Roses" of coding; not everyone likes it, but in their hearts they know it has greatness on some level. And no one can argue with their staying power :D


I left a company once due to some reorg/internal politics while we were in the middle of developing a new system from scratch. 6 years later, I was going to leave my then-current job and I got some interest from my old team (that was by that point made up of almost entirely different people) where part of my job would be... decommissioning and migrating tons of people off the system I was developing 6 years ago! I also had expertise in the new stuff they needed but I thought that was funny and also I think it helped me negotiate a good offer...


Kinda a weird article to be honest. It seems less like the author is annoyed at having to learn new languages or that their language-specific knowledge becomes less useful as time goes on, and more that they are frustrated their code doesn't live on forever.

I guess I don't particularly care if my code lives on or not. I build something. It's hopefully useful for others or myself for some period of time, then its gone.

It's just lines I've written. Why would that be more important than other lines I've written? In other careers I never thought about how my work was eventually going to be forgotten. That's just how life goes.


I felt the same way.

If you want to build something permanent and long-lasting, go build bridges. It does seem kind of silly to expect software to be around for a long time.


I've only been programming professionally for ~15 years and I fully agree with the article.

In my career I've gone through so many tech stacks, most of which are dead or dying today. A lot of my projects you could today replace with a couple API calls or an AWS product.

At least I like to think I learnt a few things along the way that make me a better developer. I never know what to say when recruiters or interviewers ask me about my tech stack experience expecting some sort of very specific answer – to me it makes as much sense as asking whether I have Firefox experience, or just Chrome.


This is why I stubbornly refuse to use Typescript.

Javascript is an ECMA standard. I'll just do the JSDoc hack and cheer for TC39 rather than waste my time with a technology that's great for now but will probably end up being technical debt.

Think of how little somebody who uses C, bash, vim, and javascript has had to adapt in the past several decades, all because they stuck with technologies where the buck stops with a community rather than a corporation.

Corporations are cool and all. I'm not a dirty hippie. But at the end of the day they don't make money when things don't change.


Element of inevitability of course, but I have to look at those tech choices - largely from the MS walled garden - and wonder whether that was a factor as well.

Do walled gardens not last as long I wonder?


Apparently, we’re making either Tibetan mandala art or Theseus’s Ship


Insert Pirates of the Caribbean meme. "This is without doubt the worst ship that's ever floated." "But it does float." There's no inherent shame in Ship-of-Theseusing something to accommodate changing requirements! Complete rewrites are hard and take forever!


Is it just me, or are programming languages and frameworks somehow not my focus any more? I'm still interested in programming language designs and implementations, but I'm quite neutral on what language/framework to use in my work. There are just too many other things that take priority: organizational structures, system architectures, emerging technologies, new hardware, algorithms, insights into a domain, etc. The choice of language barely moves the needle.


Don't get attached to your code, ever. You are not building the Pyramids of Giza, no one will study your work 3000 years from now, or 15, for that matter.

ALL of your code will be dead in a few years. The company goes under, or some punk CTO comes in and commands a "full rewrite ASAP!". Even with normal evolution and refactoring, your original work will be unrecognizable.

Either way, you take experience and friendships from your employment, you are not going to be the Picasso of software engineering.


You forgot to put "\s" here.

But I appreciate the Picasso joke. Too many devs try applying over-engineered abstraction to simple problems.

------

From this place I want to greet one such "Picasso" whose code from over 20 years ago I was debugging last week... I know your name...


> I have projects at my company in the old version of Angular that is a major technical debt we must upgrade.

My favorite job -- to refactor something old. If only businesses wanted to pay for it.


It's mine too. But it really depends on the time that has passed. Currently I'm redoing a 2-decade-old project; it's not refactoring, as that would be basically impossible. It's just a rewrite. Which I like too, but less, as now I need to really dive deep into the business side of things, as decisions need to be taken to properly incorporate all the crap that was taped on over 20 years. Or, worse, make a whole new project. Then I'm out.


I used to feel bad about how much of my career was waste. Then one day I met a retired hardware engineer, an old hand who had rubbed shoulders with Shockley, who told me that 80% of the projects he worked on never even made it to production. Four-fifths of his hardware career was waste!

If it's that bad in hardware land I don't feel so bad about this virtual detritus we software people spew. At least it's cheap, eh?


This is why I'm particularly proud of my contributions to (web) standards. There's a good chance that they'll be my longest lasting contribution.


The stickiest code I ever wrote was an open source library released in 2000 or thereabouts. It’s still in use, and still new users are adopting it. Someone made a Wikipedia page for it. I think it’s embedded in most Apple products.

Other than that, yeah, most of what I’ve made is no longer used by any significant number of people.

Not sure if there’s any wisdom to draw from this anecdote, but I did spend a lot of cycles in trying to perfect that library!


The idea of leaving behind a lasting legacy of work is vanity squared. The ones who do selfless work for open-source do not necessarily do it for a place in history. The point of choosing a purely materialistic software career was to have the ego humbling of watching one's work get decimated or trivialized in due course.


It's harder now. Way back when you could write ls and it would basically be a "done" piece of software to be distributed everywhere. More recently the big standards are enormous projects involving thousands of people, such as Docker or Kubernetes, that seem to have fairly unbounded feature lists.

One exception: I think the jq tool could stay viable, if it made it into a standard toolset. It solves a defined problem.


I find it interesting how tech that mostly comes out of enterprises tends to have a shorter shelf life than the "more open" options.

> Basic, Silverlight, ColdFusion, asp...

I'd add > Lingo, Flash

Basically, if Microsoft, Adobe or any other entrenched player makes it easy for you to develop in it and they offer the only dev stack, then it's probably going to die quickly.

Counter case: Apple seems to be doing fine with Swift. So I might be wrong here.


>Basically if Microsoft, Adobe or any other entrenched player make it easy for you to develop in it and they offer the only Dev stack then its probably going to die quickly.

.NET and C# are heavily used, so is SAP and other corporate proprietary stuff. Enterprise customers prefer entrenched players that own the whole stack.


Agreed. I also think it sort of makes my point. AFAIK the .NET ecosystem started getting significantly more traction once they started open-sourcing it and giving control to more open bodies.

I don't know why you would bring up SAP though. I've dealt with SAP deployments in the past; it's usually the type of project where you have to integrate through some arcane protocol due to reasons(TM), and the end result resembles less a live connection and more a sneakernet.


If a business lasts long enough for your code to be replaced, that’s a success. The code you wrote helped your company beat the odds and stay alive.


The other day my friend from a previous job linked me to a PR I did 7 years ago, which he accepted, that introduces a regression which went unnoticed all this time and that is probably the oldest piece of code I've written that's provably in use today.

The whole project will eventually disappear into history in favour of a rewrite which has been around since before the pandemic.


> I would argue that any apps written in Objective C are probably technical debt now.

That would probably include the lion's share of system code in the Apple OS ecosystem.

I'll bet a lot of it hearkens back to the NeXTSTEP days.

Since it is a UNIX OS (all of the OSes are), there's plenty of good ol' ANSI C as well.

I'm almost positive that every AAA app is still an ObjC BoM (Ball of Mud).


> All code rots or gets replaced

> Over time, you can see how almost everything you create gets scrapped and replaced for various reasons or is now based on old technology.

Is this problem fixable by using some great technologies? React code does not rot, and Common Lisp applications are famous for lasting the longest without rotting compared to other programming languages.


React code does not rot? How so?


There are no significant changes in architecture AFAIK. I used it just for fun a long time ago and may not know everything.


I think looking only at technologies is the wrong approach.

The underlying concepts stay the same. Your own experience is much improved.

Some old technology gets replaced? I celebrate the demise of Flash.

I am not Flash or even a Flash developer. I am a software developer.

I am a bit sad Dlang is not more used? Yes, I am. But I am not a "Dlang" developer.

Tying yourself to some technology seems self limiting.


This might ring true for most jobs and especially for anything business related. If you go lower down the stack, on the library, tooling or infrastructure side, things tend to live on much longer, especially if it's simple but necessary functionality. Code doesn't automatically become technical debt just because it gets older.


> Fast forward to today, and MVC has since fallen out of fashion. Everything is now done in React, Angular, Vue, and other frameworks.

I wouldn't be so sure about that. Fallen out of fashion perhaps, but still very stable, easy to work with, predictable, and with pretty clear best practices that don't change every three months.


Absolutely, that's why I'm trying to move to management, to leverage my non-tech experience. I'm 40 and younger colleagues are faster learners, if not already "experts" on new stuff. I often have to "forget" what and how I learned old things to learn new tech; the effort can easily double.


> One of the biggest challenges we had at Stackify was getting stuck on an old version of Elasticsearch. At one point, they made some significant changes to how it worked that were not entirely backward compatible.

I am glad I am not the only one who hit this roadblock with Elasticsearch. It felt like always trying to play catch up.
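
One example of the kind of break, going from memory (the field names here are made up): Elasticsearch 7.0 removed mapping types, so index mappings that used to nest properties under a custom type name had to be rewritten. Roughly:

    // Rough sketch from memory (field names made up): index mapping bodies
    // before and after Elasticsearch 7.0 removed mapping types.

    // Pre-7.x: properties were nested under a custom mapping type name.
    const legacyMapping = {
      mappings: {
        log_entry: {                      // custom type -- no longer allowed in 7.x
          properties: {
            message: { type: "text" },
            timestamp: { type: "date" },
          },
        },
      },
    };

    // 7.x and later: one typeless mapping per index.
    const currentMapping = {
      mappings: {
        properties: {
          message: { type: "text" },
          timestamp: { type: "date" },
        },
      },
    };

Requests written against the old shape simply stop working after the upgrade, which is exactly the kind of thing that keeps you pinned to an old version.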


Paraphrasing... "There are only two kinds of code: the kind people complain about and the kind nobody uses."


As long as you know how to get out of technical debt (islands too) when you see one, you're suited for any task.


The point this piece is missing is that it defines tech the way marketing defines a product. Customer needs evolve, and products need to change to cater to that; it's just evolution. The work that underpins the solutions (the C programming language, for instance) hasn't changed.


By the way (not to diss the author's involvement in Windows-related technologies), I am very happy to still use many of the inventions of Mr. Ken Thompson that did not become technical debt, and it's unlikely they ever will (apart from being rewritten/transformed).


Technical debt is something one has to "repay" eventually, hence the name. "Taking on technical debt" is doing something in a quick but hacky way with the intention of redoing it later and better. Which almost never happens, by the way.


I've been focusing lately on stepping off the new and shiny treadmill. While the kids and startups are chasing that new hotness, I can be incredibly productive in less fashionable tools that I already know.


For all 5 years of my web software development career, I've been porting code to Angular. I've read your legacy code and ported it to a framework that's already in decline. Of course it's all legacy.


So what are the current languages and frameworks that are least likely to soon become “technical debt”? We use Python, Django, Node.js, Next.js, and React. How do those fare?


The Strong Bad clip at the end just transported me back 15 years.


20 years ago I was mostly writing C++ for work. I still am.


> Developers jump ship quickly

this is probably the most damning thing about the field. We don't invest in developers, so why would they invest in the business?


The dude equates programming in Perl with programming in FoxPro; both are simply "deprecated" and "hard to find." Is this true?


Perl is not so much deprecated; it's more that the community performed a collective jump off a cliff with Perl 6.

If you think the Python 2->3 transition was ugly, you haven't been watching that slow-mo trainwreck.


Wait, wasn't Perl 6 rebranded as a new language, Raku?


I believe so, after years of being vapourware and basically stalling Perl 5 development right when the web as a platform started becoming important.


Perl 6 did get released, with that name, after years stuck in development (source: https://developers.slashdot.org/story/15/12/26/0354235/perl-...). Only later was it rebranded.


In 2019 to be precise: https://raku.org


The guy way overused "deprecated" in that article.

Perl 5 is alive and well, and while not the most actively developed of languages, it still has a steady trickle of small improvements.

It's installable in all major Linux distros, and you can probably get code from two decades ago running without much hassle.

Most people would shudder at the thought of starting a new project in it, but if you need to keep a legacy codebase going, the community has your back for many, many years to come.


Use jQuery. Let others jump on these speeding, then derailing trains and collect their broken bodies at the crash sites years later.


Similar vibes, looking back on my CV; now I'm the "grumpy old man" shaking his fist at reactive programming and VIPER…


I use an open-source version of ColdFusion at the company I work for, which was founded 10 years ago.

I mean, I don't like it but...


Don't be sad because your code is old. Smile because you got to write code. -- Me (paraphrasing Dr. Seuss)


The guy is still coding for money, right? So clearly he managed to absorb something that's still valuable.


Interesting article. I wonder how, say, builders or architects feel when a building of theirs is demolished.


It's paradoxical, but the opposite might be true as well… code you'd be happy to hear was replaced, because you were not particularly proud of it, turns out to live on for a long time, with people maintaining it without understanding it very well, long after you've left the organization.


I mean, unless humans as a species stop growing, all of your career will be deprecated one day, right?


That is true for front-end dev, not so much for back end. SQL, Oracle, SQL Server, Teradata, and data warehousing are still used after more than 20 years. The skills transfer easily to Snowflake, BigQuery, Azure... (whatever it's called at this moment).


Not anymore; now we have JavaScript, and that stuff will simply run until the apocalypse.


Some of the things I've worked on (Nokia phones) have been museum pieces for some time.


My 20+ year career is technical debt, deprecated, or parts of governmental standards.


Will Rust succeed in eventually making all C (and even C++) code technical debt?


I don't think it will... C will never die.

If it does, congrats. But so far, for many things, C is still the prime point of access.


I’m assuming your paychecks didn’t bounce over those 20 years so who cares?


> My entire career is now technical debt, or the code has been deprecated.

Contrary to the author, over my entire career I feel like I have been incredibly lucky with my choices of 1) what (web) technologies to invest lots of time into, and 2) therefore what tools I use in my own projects and at the companies I have worked for.

I think that 50% of the reason comes down to when I was born and therefore when I started my software engineering career (thereby avoiding a lot of the bad alleyways web dev went down), and 50% was just common sense (realizing early on that tool X was obviously going to be superior to tool Y).

1. I was fortunate enough to be born at a time such that I just about missed the AngularJS --> Angular2 betrayal + debacle.

2. I realized, through playing around with the Angular2 beta around 2014 (IIRC), that it was going to be inferior to React. I remember that at the time React was this tiny thing that had existed for around a year or so, but it was clearly conceptually superior, with more talented Facebook developers behind it vs the Google developers behind the early Angular2.

Not to be too disparaging or disrespectful, but at least in the early days, it felt like Angular2 was made by a team of your usual C++ Google engineers who didn't have a lot of experience with web dev. Just being honest here...

This led me to A) selecting React for my personal projects, and B) advising companies I worked for to pick React. This means there are React codebases I contributed to ~8 years ago that are still active at those companies today.

3. Through sheer luck, I started my true web dev career after the "random crazy insecure web technologies" era such as Flash, Java applets, and all that nonsense.

4. Through sheer luck, I started my true web dev career shortly after TypeScript was created, and I just happened to use a framework (now non-existent) that used it, which introduced it to me in ~2014. This meant that I have been able to create relatively maintainable TypeScript codebases, mostly avoiding creating any quick-to-deprecate JS mess-heaps.
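
(As a rough illustration of why I think that bet paid off, a made-up example rather than anything from a real codebase: the kind of refactor that silently breaks a plain JS codebase gets flagged by the TypeScript compiler at every call site.)

    // Made-up example: change a type during a refactor and the compiler
    // flags every call site that still assumes the old shape.
    interface User {
      id: string;          // was `number` before the refactor
      displayName: string; // was `name` before the refactor
    }

    function greet(user: User): string {
      return `Hello, ${user.displayName}!`;
    }

    // greet({ id: 42, name: "Ada" });
    // ^ compile error in TS; in plain JS this would only fail at runtime (or silently).
    greet({ id: "42", displayName: "Ada" }); // OK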

There are a couple more examples, but I think that captures why I feel so lucky. I feel bad for the engineers that, through a combination of bad luck (time born, time entering the field, etc.) and lack of foresight, were led down paths causing them to invest a tonne of time into technologies that were just never going to be around for the long-run.

My time will likely come where my luck runs out, and I end up investing a tonne of time into some nonsense tech that initially seems solid but ends up rubbish for whatever reason, but for now, I'm quite pleased :)


I'm a freelance data engineer / machine learning engineer. I've so far never held a role that I already had the technical skills for (well... I do have the underlying knowledge, but not the particular frameworks/tools), because every time my freelance project ends and I'm looking for the next one, the tools have already changed to something else that is hip. Or the go-to architecture is now a different one. Or the way you deploy it all is completely different.

It's pretty frustrating. I keep taking courses to keep my skills up to date, but it doesn't matter; those courses are themselves outdated a year later. The only things that have stayed solid in my career are CS fundamentals and the theoretical knowledge from books like 'Designing Data-Intensive Applications'. That stuff rarely changes. The rest is like waves in a sea: they come and go.


Today’s best practice is tomorrow’s technical debt.

Paint that shed.


After 20 years (for me) Java is still kind of alive


Flaaash! (shouting in a William Shatner voice)


Weird, tons of my 20-year-old code is still around, but my 30-year-old code, except for the embedded stuff, is likely dead.



